storage_location_mixin

The storage_location_mixin module provides two mixins for managing storage locations on entities (Projects and Folders):

  • StorageLocationConfigurable — base mixin providing STS credentials and file migration.
  • ProjectSettingsMixin — extends StorageLocationConfigurable with project-level storage settings.

For architecture diagrams and design documentation, see Storage Location Architecture.

Methods Overview

StorageLocationConfigurable

Method                      Description
get_sts_storage_token       Get STS credentials for direct S3 access
index_files_for_migration   Index files for migration to a new storage location
migrate_indexed_files       Migrate previously indexed files

ProjectSettingsMixin (extends StorageLocationConfigurable)

Method                  Description
set_storage_location    Set the upload storage location for this entity (destructive replace)
get_project_setting     Get project settings (upload, external_sync, etc.)
delete_project_setting  Delete a project setting

All methods have async equivalents with an _async suffix (e.g. get_sts_storage_token_async).

Migration Workflow

Migration is a two-step process:

  1. Index — call index_files_for_migration to scan the entity and record files to migrate in a local SQLite database.
  2. Migrate — call migrate_indexed_files with the database path to perform the actual copy.
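The two steps can be sketched with the synchronous API. This is a minimal sketch, not runnable against Synapse as-is: `entity` stands in for a fetched Project or Folder, and `dest_id` for a real storage location ID.

```python
def migrate_storage(entity, dest_id: int):
    # Step 1: scan the entity and record the files to migrate in a
    # SQLite tracking database.
    index_result = entity.index_files_for_migration(
        dest_storage_location_id=dest_id,
    )
    # Step 2: copy the indexed files. force=True skips the interactive
    # confirmation prompt so this also works in scripts and CI.
    return entity.migrate_indexed_files(
        db_path=index_result.db_path,
        force=True,
    )
```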

index_files_for_migration parameters

Parameter                     Default    Description
dest_storage_location_id      required   Destination storage location ID
db_path                       None       Path for the SQLite tracking database; a temp path is used if omitted
source_storage_location_ids   None       Restrict to files in these source locations; None means all locations
file_version_strategy         "new"      "new" / "all" / "latest" / "skip"
include_table_files           False      Whether to include files attached to tables
continue_on_error             False      Record errors and continue rather than raising

Returns a MigrationResult — access result.db_path to pass to the next step.
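The optional parameters from the table above can be combined, for example to keep the tracking database for later inspection and restrict the scan. A sketch, with `entity`, the IDs, and the path as placeholders:

```python
def index_latest_from_sources(entity, dest_id: int, source_ids, db_path: str):
    # db_path persists the SQLite tracking database instead of a temp
    # file; source_storage_location_ids limits the scan to those source
    # locations; "latest" indexes only each file's latest version.
    return entity.index_files_for_migration(
        dest_storage_location_id=dest_id,
        db_path=db_path,
        source_storage_location_ids=source_ids,
        file_version_strategy="latest",
    )
```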

migrate_indexed_files parameters

Parameter               Default   Description
db_path                 required  Path returned by index_files_for_migration
create_table_snapshots  True      Create table snapshots before migrating table files
continue_on_error       False     Record errors and continue rather than raising
force                   False     Skip the interactive confirmation prompt (required for non-interactive/CI use)

Returns a MigrationResult, or None if migration was aborted (user declined the prompt, or the session is non-interactive and force=False).
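Because migrate_indexed_files can return None, callers should check the result before reading statistics. A minimal sketch — MigrationResult is stubbed here purely for illustration; the real class comes from synapseclient:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MigrationResult:  # stand-in for synapseclient's MigrationResult
    counts_by_status: dict

def report(result: Optional[MigrationResult]) -> str:
    if result is None:
        # Aborted: the user declined the prompt, or the session was
        # non-interactive and force=True was not set.
        return "migration aborted"
    return f"migrated: {result.counts_by_status}"

print(report(None))
print(report(MigrationResult(counts_by_status={"MIGRATED": 10})))
```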

Usage Examples

Set a storage location

from synapseclient.models import Folder

# Replace all storage locations on a folder
folder = Folder(id="syn123").get()
folder.set_storage_location(storage_location_id=12345)

# Add a storage location without removing existing ones
setting = folder.get_project_setting(setting_type="upload")
if setting:
    setting.locations.append(67890)
    setting.store()

Get STS credentials

Note: The entity must use a storage location that has STS enabled.

credentials = folder.get_sts_storage_token(
    permission="read_write",
    output_format="boto",
)

Migrate files to a new storage location

import asyncio
from synapseclient import Synapse
from synapseclient.models import Project

syn = Synapse()
syn.login()

async def main():
    project = await Project(id="syn123").get_async()

    # Step 1: index
    index_result = await project.index_files_for_migration_async(
        dest_storage_location_id=12345,
    )
    print(f"Database path: {index_result.db_path}")

    # Step 2: migrate (force=True for non-interactive scripts)
    result = await project.migrate_indexed_files_async(
        db_path=index_result.db_path,
        force=True,
    )
    print(result.counts_by_status)

asyncio.run(main())

synapseclient.models.mixins.StorageLocationConfigurable

Bases: StorageLocationConfigurableSynchronousProtocol

Mixin for objects that can have their storage location configured.

In order to use this mixin, the class must have an id attribute.

This mixin provides methods for:

  • Getting STS (AWS Security Token Service) credentials for direct S3 access
  • Migrating files to a new storage location

Source code in synapseclient/models/mixins/storage_location_mixin.py
@async_to_sync
class StorageLocationConfigurable(StorageLocationConfigurableSynchronousProtocol):
    """Mixin for objects that can have their storage location configured.

    In order to use this mixin, the class must have an `id` attribute.

    This mixin provides methods for:
    - Getting STS (AWS Security Token Service) credentials for direct S3 access
    - Migrating files to a new storage location
    """

    id: Optional[str] = None
    """The unique immutable ID for this entity."""

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_GetStsStorageToken: {self.id}"
    )
    async def get_sts_storage_token_async(
        self,
        permission: str,
        *,
        output_format: str = "json",
        min_remaining_life: Optional[int] = None,
        synapse_client: Optional[Synapse] = None,
    ) -> Any:
        """Get STS (AWS Security Token Service) credentials for direct access to
        the storage location backing this entity. These credentials can be used
        with AWS tools like awscli and boto3.
        Note: The entity must use a storage location that has STS enabled.

        Arguments:
            permission: The permission level for the token. Must be 'read_only'
                or 'read_write'.
            output_format: The output format for the credentials. Options:
                'json' (default), 'boto', 'shell', 'bash', 'cmd', 'powershell'.
            min_remaining_life: The minimum remaining life (in seconds) for a
                cached token before a new one is fetched.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The STS credentials in the requested format.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Using credentials with boto3
            Get STS credentials for an STS-enabled folder and use with boto3:

                import asyncio
                import boto3
                from synapseclient import Synapse
                from synapseclient.models import Folder

                syn = Synapse()
                syn.login()

                async def main():
                    folder = await Folder(id="syn123").get_async()
                    credentials = await folder.get_sts_storage_token_async(
                        permission="read_write",
                        output_format="boto",
                    )
                    s3_client = boto3.client('s3', **credentials)

                asyncio.run(main())
        """
        if not self.id:
            raise ValueError("The entity must have an id set.")

        from synapseclient.core import sts_transfer

        client = Synapse.get_client(synapse_client=synapse_client)

        return await asyncio.to_thread(
            sts_transfer.get_sts_credentials,
            client,
            self.id,
            permission,
            output_format=output_format,
            min_remaining_life=min_remaining_life,
        )

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_IndexFilesForMigration: {self.id}"
    )
    async def index_files_for_migration_async(
        self,
        dest_storage_location_id: int,
        db_path: Optional[str] = None,
        *,
        source_storage_location_ids: Optional[List[int]] = None,
        file_version_strategy: str = "new",
        include_table_files: bool = False,
        continue_on_error: bool = False,
        synapse_client: Optional[Synapse] = None,
    ) -> MigrationResult:
        """Index files in this entity for migration to a new storage location.

        This is the first step in migrating files to a new storage location.
        After indexing, use `migrate_indexed_files` to perform the actual migration.

        Arguments:
            dest_storage_location_id: The destination storage location ID.
            db_path: Path to the SQLite database file for tracking migration state.
                If not provided, a temporary directory will be used. The path
                can be retrieved from the returned MigrationResult.db_path.
            source_storage_location_ids: Optional list of source storage location IDs
                to filter which files to migrate. If None, all files are indexed.
            file_version_strategy: Strategy for handling file versions. Options:
                'new' (default) - create new versions, 'all' - migrate all versions,
                'latest' - only migrate latest version, 'skip' - skip if file exists.
            include_table_files: Whether to include files attached to tables.
            continue_on_error: Whether to continue indexing if an error occurs.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            A MigrationResult object containing indexing statistics and the database
            path (accessible via result.db_path).

        Example: Indexing files for migration
            Index files in a project for migration:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Project

                syn = Synapse()
                syn.login()

                async def main():
                    project = await Project(id="syn123").get_async()
                    result = await project.index_files_for_migration_async(
                        dest_storage_location_id=12345,
                    )
                    print(f"Database path: {result.db_path}")
                    print(f"Indexed {result.counts_by_status}")

                asyncio.run(main())
        """
        if not self.id:
            raise ValueError("The entity must have an id set.")

        return await _index_files_for_migration_async(
            self,
            dest_storage_location_id=str(dest_storage_location_id),
            db_path=db_path,
            source_storage_location_ids=(
                [str(s) for s in source_storage_location_ids]
                if source_storage_location_ids
                else None
            ),
            file_version_strategy=file_version_strategy,
            include_table_files=include_table_files,
            continue_on_error=continue_on_error,
            synapse_client=synapse_client,
        )

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_MigrateIndexedFiles: {self.id}"
    )
    async def migrate_indexed_files_async(
        self,
        db_path: str,
        *,
        create_table_snapshots: bool = True,
        continue_on_error: bool = False,
        force: bool = False,
        synapse_client: Optional[Synapse] = None,
    ) -> Optional[MigrationResult]:
        """Migrate files that have been indexed with `index_files_for_migration`.

        This is the second step in migrating files to a new storage location.
        Files must first be indexed using `index_files_for_migration`.

        **Interactive confirmation:** When called from an interactive shell and
        ``force=False`` (the default), this method will print the number of items
        queued for migration and prompt for confirmation before proceeding. If
        standard output is not connected to an interactive terminal (e.g. a script
        or CI environment), migration is aborted unless ``force=True`` is set.

        Arguments:
            db_path: Path to the SQLite database file created by
                `index_files_for_migration`. You can get this from the
                MigrationResult.db_path returned by index_files_for_migration.
            create_table_snapshots: Whether to create table snapshots before
                migrating table files.
            continue_on_error: Whether to continue migration if an error occurs.
            force: Skip the interactive confirmation prompt and proceed with
                migration automatically. Set to ``True`` when running
                non-interactively (scripts, CI, automated pipelines).
                Defaults to False.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            A MigrationResult object containing migration statistics, or None
            if migration was aborted (user declined the confirmation prompt, or
            the session is non-interactive and force=False).

        Example: Migrating indexed files
            Migrate previously indexed files:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Project

                syn = Synapse()
                syn.login()

                async def main():
                    project = await Project(id="syn123").get_async()

                    # Index first
                    index_result = await project.index_files_for_migration_async(
                        dest_storage_location_id=12345,
                    )

                    # Then migrate using the db_path from index result
                    result = await project.migrate_indexed_files_async(
                        db_path=index_result.db_path,
                        force=True,  # Skip interactive confirmation
                    )
                    print(f"Migrated {result.counts_by_status}")

                asyncio.run(main())
        """
        if not self.id:
            raise ValueError("The entity must have an id set.")

        return await _migrate_indexed_files_async(
            db_path=db_path,
            create_table_snapshots=create_table_snapshots,
            continue_on_error=continue_on_error,
            force=force,
            synapse_client=synapse_client,
        )

Attributes

id class-attribute instance-attribute

id: Optional[str] = None

The unique immutable ID for this entity.

Functions

get_sts_storage_token_async async

get_sts_storage_token_async(permission: str, *, output_format: str = 'json', min_remaining_life: Optional[int] = None, synapse_client: Optional[Synapse] = None) -> Any

Get STS (AWS Security Token Service) credentials for direct access to the storage location backing this entity. These credentials can be used with AWS tools like awscli and boto3. Note: The entity must use a storage location that has STS enabled.

PARAMETER DESCRIPTION
permission

The permission level for the token. Must be 'read_only' or 'read_write'.

TYPE: str

output_format

The output format for the credentials. Options: 'json' (default), 'boto', 'shell', 'bash', 'cmd', 'powershell'.

TYPE: str DEFAULT: 'json'

min_remaining_life

The minimum remaining life (in seconds) for a cached token before a new one is fetched.

TYPE: Optional[int] DEFAULT: None

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Any

The STS credentials in the requested format.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Using credentials with boto3

Get STS credentials for an STS-enabled folder and use with boto3:

import asyncio
import boto3
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    folder = await Folder(id="syn123").get_async()
    credentials = await folder.get_sts_storage_token_async(
        permission="read_write",
        output_format="boto",
    )
    s3_client = boto3.client('s3', **credentials)

asyncio.run(main())

index_files_for_migration_async async

index_files_for_migration_async(dest_storage_location_id: int, db_path: Optional[str] = None, *, source_storage_location_ids: Optional[List[int]] = None, file_version_strategy: str = 'new', include_table_files: bool = False, continue_on_error: bool = False, synapse_client: Optional[Synapse] = None) -> MigrationResult

Index files in this entity for migration to a new storage location.

This is the first step in migrating files to a new storage location. After indexing, use migrate_indexed_files to perform the actual migration.

PARAMETER DESCRIPTION
dest_storage_location_id

The destination storage location ID.

TYPE: int

db_path

Path to the SQLite database file for tracking migration state. If not provided, a temporary directory will be used. The path can be retrieved from the returned MigrationResult.db_path.

TYPE: Optional[str] DEFAULT: None

source_storage_location_ids

Optional list of source storage location IDs to filter which files to migrate. If None, all files are indexed.

TYPE: Optional[List[int]] DEFAULT: None

file_version_strategy

Strategy for handling file versions. Options: 'new' (default) - create new versions, 'all' - migrate all versions, 'latest' - only migrate latest version, 'skip' - skip if file exists.

TYPE: str DEFAULT: 'new'

include_table_files

Whether to include files attached to tables.

TYPE: bool DEFAULT: False

continue_on_error

Whether to continue indexing if an error occurs.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
MigrationResult

A MigrationResult object containing indexing statistics and the database path (accessible via result.db_path).

Indexing files for migration

Index files in a project for migration:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Project

syn = Synapse()
syn.login()

async def main():
    project = await Project(id="syn123").get_async()
    result = await project.index_files_for_migration_async(
        dest_storage_location_id=12345,
    )
    print(f"Database path: {result.db_path}")
    print(f"Indexed {result.counts_by_status}")

asyncio.run(main())

migrate_indexed_files_async async

migrate_indexed_files_async(db_path: str, *, create_table_snapshots: bool = True, continue_on_error: bool = False, force: bool = False, synapse_client: Optional[Synapse] = None) -> Optional[MigrationResult]

Migrate files that have been indexed with index_files_for_migration.

This is the second step in migrating files to a new storage location. Files must first be indexed using index_files_for_migration.

Interactive confirmation: When called from an interactive shell and force=False (the default), this method will print the number of items queued for migration and prompt for confirmation before proceeding. If standard output is not connected to an interactive terminal (e.g. a script or CI environment), migration is aborted unless force=True is set.

PARAMETER DESCRIPTION
db_path

Path to the SQLite database file created by index_files_for_migration. You can get this from the MigrationResult.db_path returned by index_files_for_migration.

TYPE: str

create_table_snapshots

Whether to create table snapshots before migrating table files.

TYPE: bool DEFAULT: True

continue_on_error

Whether to continue migration if an error occurs.

TYPE: bool DEFAULT: False

force

Skip the interactive confirmation prompt and proceed with migration automatically. Set to True when running non-interactively (scripts, CI, automated pipelines). Defaults to False.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Optional[MigrationResult]

A MigrationResult object containing migration statistics, or None if migration was aborted (user declined the confirmation prompt, or the session is non-interactive and force=False).

Migrating indexed files

Migrate previously indexed files:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Project

syn = Synapse()
syn.login()

async def main():
    project = await Project(id="syn123").get_async()

    # Index first
    index_result = await project.index_files_for_migration_async(
        dest_storage_location_id=12345,
    )

    # Then migrate using the db_path from index result
    result = await project.migrate_indexed_files_async(
        db_path=index_result.db_path,
        force=True,  # Skip interactive confirmation
    )
    print(f"Migrated {result.counts_by_status}")

asyncio.run(main())
Source code in synapseclient/models/mixins/storage_location_mixin.py
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"Entity_MigrateIndexedFiles: {self.id}"
)
async def migrate_indexed_files_async(
    self,
    db_path: str,
    *,
    create_table_snapshots: bool = True,
    continue_on_error: bool = False,
    force: bool = False,
    synapse_client: Optional[Synapse] = None,
) -> Optional[MigrationResult]:
    """Migrate files that have been indexed with `index_files_for_migration`.

    This is the second step in migrating files to a new storage location.
    Files must first be indexed using `index_files_for_migration`.

    **Interactive confirmation:** When called from an interactive shell and
    ``force=False`` (the default), this method will print the number of items
    queued for migration and prompt for confirmation before proceeding. If
    standard output is not connected to an interactive terminal (e.g. a script
    or CI environment), migration is aborted unless ``force=True`` is set.

    Arguments:
        db_path: Path to the SQLite database file created by
            `index_files_for_migration`. You can get this from the
            MigrationResult.db_path returned by index_files_for_migration.
        create_table_snapshots: Whether to create table snapshots before
            migrating table files.
        continue_on_error: Whether to continue migration if an error occurs.
        force: Skip the interactive confirmation prompt and proceed with
            migration automatically. Set to ``True`` when running
            non-interactively (scripts, CI, automated pipelines).
            Defaults to False.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        A MigrationResult object containing migration statistics, or None
        if migration was aborted (user declined the confirmation prompt, or
        the session is non-interactive and force=False).

    Example: Migrating indexed files
        Migrate previously indexed files:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Project

            syn = Synapse()
            syn.login()

            async def main():
                project = await Project(id="syn123").get_async()

                # Index first
                index_result = await project.index_files_for_migration_async(
                    dest_storage_location_id=12345,
                )

                # Then migrate using the db_path from index result
                result = await project.migrate_indexed_files_async(
                    db_path=index_result.db_path,
                    force=True,  # Skip interactive confirmation
                )
                print(f"Migrated {result.counts_by_status}")

            asyncio.run(main())
    """
    if not self.id:
        raise ValueError("The entity must have an id set.")

    return await _migrate_indexed_files_async(
        db_path=db_path,
        create_table_snapshots=create_table_snapshots,
        continue_on_error=continue_on_error,
        force=force,
        synapse_client=synapse_client,
    )

synapseclient.models.mixins.ProjectSettingsMixin

Bases: StorageLocationConfigurable

Mixin for objects that can have their project settings configured.

Extends StorageLocationConfigurable with methods for managing project settings such as upload storage locations.

In order to use this mixin, the class must have an id attribute.

Source code in synapseclient/models/mixins/storage_location_mixin.py
@async_to_sync
class ProjectSettingsMixin(StorageLocationConfigurable):
    """Mixin for objects that can have their project settings configured.

    Extends StorageLocationConfigurable with methods for managing project
    settings such as upload storage locations.

    In order to use this mixin, the class must have an `id` attribute.
    """

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_SetStorageLocation: {self.id}"
    )
    async def set_storage_location_async(
        self,
        storage_location_id: Optional[
            Union[int, List[int]]
        ] = DEFAULT_STORAGE_LOCATION_ID,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> "ProjectSetting":
        """Set the upload storage location for this entity. This configures where
        files uploaded to this entity will be stored.

        **This is a destructive update.** The provided `storage_location_id` value(s)
        will **replace** any storage locations previously configured on this entity.
        To add a storage location without removing existing ones, first retrieve the
        current setting via `get_project_setting_async`, append to its `locations`
        list, and call `store_async` on the returned `ProjectSetting` directly.

        Arguments:
            storage_location_id: The storage location ID(s) to set. Can be a single
                ID, a list of IDs (first is default, max 10). By default, the
                default Synapse S3 storage location is used.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The ProjectSetting object reflecting the current state after the operation.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Replace all storage locations
            Fully replace the storage location on a folder with a single location:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Folder

                syn = Synapse()
                syn.login()

                async def main():
                    folder = await Folder(id="syn123").get_async()
                    setting = await folder.set_storage_location_async(
                        storage_location_id=12345
                    )
                    print(setting)

                asyncio.run(main())

        Example: Partial update — add a storage location without removing existing ones
            Retrieve the current setting and append a new location:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Folder

                syn = Synapse()
                syn.login()

                async def main():
                    folder = await Folder(id="syn123").get_async()
                    setting = await folder.get_project_setting_async(setting_type="upload")
                    if setting:
                        setting.locations.append(67890)
                        await setting.store_async()

                asyncio.run(main())
        """
        if not self.id:
            raise ValueError("The entity must have an id set.")

        if storage_location_id is None:
            locations = [DEFAULT_STORAGE_LOCATION_ID]
        elif isinstance(storage_location_id, list):
            locations = storage_location_id
        else:
            locations = [storage_location_id]
        setting = await ProjectSetting(
            project_id=self.id, settings_type="upload"
        ).get_async(synapse_client=synapse_client)

        if setting is None:
            setting = ProjectSetting(
                project_id=self.id,
                settings_type="upload",
                locations=locations,
            )
        else:
            setting.locations = locations
        return await setting.store_async(synapse_client=synapse_client)

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_GetProjectSetting: {self.id}"
    )
    async def get_project_setting_async(
        self,
        setting_type: str = "upload",
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> Optional["ProjectSetting"]:
        """Get the project setting for this entity.

        Arguments:
            setting_type: The type of setting to retrieve. One of:
                'upload', 'external_sync', 'requester_pays'. Default: 'upload'.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The ProjectSetting object, or None if no setting exists.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Using this function
            Get the upload settings for a folder:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Folder

                syn = Synapse()
                syn.login()

                async def main():
                    folder = await Folder(id="syn123").get_async()
                    setting = await folder.get_project_setting_async(setting_type="upload")
                    if setting:
                        print(f"Storage locations: {setting.locations}")

                asyncio.run(main())
        """
        if not self.id:
            raise ValueError("The entity must have an id set.")

        return await ProjectSetting(
            project_id=self.id, settings_type=setting_type
        ).get_async(synapse_client=synapse_client)

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"Entity_DeleteProjectSetting: {self.id}"
    )
    async def delete_project_setting_async(
        self,
        setting_id: str,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> None:
        """Delete a project setting by its setting ID.

        Arguments:
            setting_id: The ID of the project setting to delete.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            None

        Raises:
            ValueError: If `setting_id` is not provided.

        Example: Using this function
            Delete the upload settings for a folder:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import Folder

                syn = Synapse()
                syn.login()

                async def main():
                    folder = await Folder(id="syn123").get_async()
                    await folder.delete_project_setting_async(setting_id="123")

                asyncio.run(main())
        """
        if not setting_id:
            raise ValueError("The id is required to delete a project setting.")
        await ProjectSetting(id=setting_id).delete_async(synapse_client=synapse_client)

Functions

set_storage_location_async async

set_storage_location_async(storage_location_id: Optional[Union[int, List[int]]] = DEFAULT_STORAGE_LOCATION_ID, *, synapse_client: Optional[Synapse] = None) -> ProjectSetting

Set the upload storage location for this entity. This configures where files uploaded to this entity will be stored.

This is a destructive update. The provided storage_location_id value(s) will replace any storage locations previously configured on this entity. To add a storage location without removing existing ones, first retrieve the current setting via get_project_setting_async, append to its locations list, and call store_async on the returned ProjectSetting directly.

PARAMETER DESCRIPTION
storage_location_id

The storage location ID(s) to set. Can be a single ID, a list of IDs (first is default, max 10). By default, the default Synapse S3 storage location is used.

TYPE: Optional[Union[int, List[int]]] DEFAULT: DEFAULT_STORAGE_LOCATION_ID

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
ProjectSetting

The ProjectSetting object reflecting the current state after the operation.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Replace all storage locations

Fully replace the storage location on a folder with a single location:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    folder = await Folder(id="syn123").get_async()
    setting = await folder.set_storage_location_async(
        storage_location_id=12345
    )
    print(setting)

asyncio.run(main())
Partial update — add a storage location without removing existing ones

Retrieve the current setting and append a new location:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    folder = await Folder(id="syn123").get_async()
    setting = await folder.get_project_setting_async(setting_type="upload")
    if setting:
        setting.locations.append(67890)
        await setting.store_async()

asyncio.run(main())
Source code in synapseclient/models/mixins/storage_location_mixin.py
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"Entity_SetStorageLocation: {self.id}"
)
async def set_storage_location_async(
    self,
    storage_location_id: Optional[
        Union[int, List[int]]
    ] = DEFAULT_STORAGE_LOCATION_ID,
    *,
    synapse_client: Optional[Synapse] = None,
) -> "ProjectSetting":
    """Set the upload storage location for this entity. This configures where
    files uploaded to this entity will be stored.

    **This is a destructive update.** The provided `storage_location_id` value(s)
    will **replace** any storage locations previously configured on this entity.
    To add a storage location without removing existing ones, first retrieve the
    current setting via `get_project_setting_async`, append to its `locations`
    list, and call `store_async` on the returned `ProjectSetting` directly.

    Arguments:
        storage_location_id: The storage location ID(s) to set. Can be a single
            ID, a list of IDs (first is default, max 10). By default, the
            default Synapse S3 storage location is used.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The ProjectSetting object reflecting the current state after the operation.

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Replace all storage locations
        Fully replace the storage location on a folder with a single location:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Folder

            syn = Synapse()
            syn.login()

            async def main():
                folder = await Folder(id="syn123").get_async()
                setting = await folder.set_storage_location_async(
                    storage_location_id=12345
                )
                print(setting)

            asyncio.run(main())

    Example: Partial update — add a storage location without removing existing ones
        Retrieve the current setting and append a new location:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Folder

            syn = Synapse()
            syn.login()

            async def main():
                folder = await Folder(id="syn123").get_async()
                setting = await folder.get_project_setting_async(setting_type="upload")
                if setting:
                    setting.locations.append(67890)
                    await setting.store_async()

            asyncio.run(main())
    """
    if not self.id:
        raise ValueError("The entity must have an id set.")

    if storage_location_id is None:
        locations = [DEFAULT_STORAGE_LOCATION_ID]
    elif isinstance(storage_location_id, list):
        locations = storage_location_id
    else:
        locations = [storage_location_id]
    setting = await ProjectSetting(
        project_id=self.id, settings_type="upload"
    ).get_async(synapse_client=synapse_client)

    if setting is None:
        setting = ProjectSetting(
            project_id=self.id,
            settings_type="upload",
            locations=locations,
        )
    else:
        setting.locations = locations
    return await setting.store_async(synapse_client=synapse_client)
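The argument handling above reduces to a small normalization rule; this standalone sketch mirrors it (the value `1` for the default Synapse S3 location id is an assumption for illustration; the mixin imports the real `DEFAULT_STORAGE_LOCATION_ID` constant):

```python
from typing import List, Optional, Union

# Assumption for illustration only; the mixin imports the real constant.
DEFAULT_STORAGE_LOCATION_ID = 1

def normalize_locations(
    storage_location_id: Optional[Union[int, List[int]]],
) -> List[int]:
    # Mirrors set_storage_location_async: None falls back to the default
    # Synapse S3 location, a bare int becomes a one-element list, and a
    # list passes through (its first entry is the default; max 10 entries).
    if storage_location_id is None:
        return [DEFAULT_STORAGE_LOCATION_ID]
    if isinstance(storage_location_id, list):
        return storage_location_id
    return [storage_location_id]
```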

get_project_setting_async async

get_project_setting_async(setting_type: str = 'upload', *, synapse_client: Optional[Synapse] = None) -> Optional[ProjectSetting]

Get the project setting for this entity.

PARAMETER DESCRIPTION
setting_type

The type of setting to retrieve. One of: 'upload', 'external_sync', 'requester_pays'. Default: 'upload'.

TYPE: str DEFAULT: 'upload'

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Optional[ProjectSetting]

The ProjectSetting object, or None if no setting exists.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Using this function

Get the upload settings for a folder:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    folder = await Folder(id="syn123").get_async()
    setting = await folder.get_project_setting_async(setting_type="upload")
    if setting:
        print(f"Storage locations: {setting.locations}")

asyncio.run(main())
Source code in synapseclient/models/mixins/storage_location_mixin.py
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"Entity_GetProjectSetting: {self.id}"
)
async def get_project_setting_async(
    self,
    setting_type: str = "upload",
    *,
    synapse_client: Optional[Synapse] = None,
) -> Optional["ProjectSetting"]:
    """Get the project setting for this entity.

    Arguments:
        setting_type: The type of setting to retrieve. One of:
            'upload', 'external_sync', 'requester_pays'. Default: 'upload'.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The ProjectSetting object, or None if no setting exists.

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Using this function
        Get the upload settings for a folder:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Folder

            syn = Synapse()
            syn.login()

            async def main():
                folder = await Folder(id="syn123").get_async()
                setting = await folder.get_project_setting_async(setting_type="upload")
                if setting:
                    print(f"Storage locations: {setting.locations}")

            asyncio.run(main())
    """
    if not self.id:
        raise ValueError("The entity must have an id set.")

    return await ProjectSetting(
        project_id=self.id, settings_type=setting_type
    ).get_async(synapse_client=synapse_client)

delete_project_setting_async async

delete_project_setting_async(setting_id: str, *, synapse_client: Optional[Synapse] = None) -> None

Delete a project setting by its setting ID.

PARAMETER DESCRIPTION
setting_id

The ID of the project setting to delete.

TYPE: str

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
None

None

RAISES DESCRIPTION
ValueError

If setting_id is not provided.

Using this function

Delete the upload settings for a folder:

import asyncio
from synapseclient import Synapse
from synapseclient.models import Folder

syn = Synapse()
syn.login()

async def main():
    folder = await Folder(id="syn123").get_async()
    await folder.delete_project_setting_async(setting_id="123")

asyncio.run(main())
Source code in synapseclient/models/mixins/storage_location_mixin.py
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"Entity_DeleteProjectSetting: {self.id}"
)
async def delete_project_setting_async(
    self,
    setting_id: str,
    *,
    synapse_client: Optional[Synapse] = None,
) -> None:
    """Delete a project setting by its setting ID.

    Arguments:
        setting_id: The ID of the project setting to delete.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        None

    Raises:
        ValueError: If `setting_id` is not provided.

    Example: Using this function
        Delete the upload settings for a folder:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import Folder

            syn = Synapse()
            syn.login()

            async def main():
                folder = await Folder(id="syn123").get_async()
                await folder.delete_project_setting_async(setting_id="123")

            asyncio.run(main())
    """
    if not setting_id:
        raise ValueError("The id is required to delete a project setting.")
    await ProjectSetting(id=setting_id).delete_async(synapse_client=synapse_client)

synapseclient.models.protocols.storage_location_mixin_protocol.StorageLocationConfigurableSynchronousProtocol

Bases: Protocol

The protocol for methods that are asynchronous but also have a synchronous counterpart that may also be called.

Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
class StorageLocationConfigurableSynchronousProtocol(Protocol):
    """
    The protocol for methods that are asynchronous but also
    have a synchronous counterpart that may also be called.
    """

    def set_storage_location(
        self,
        storage_location_id: Optional[Union[int, List[int]]] = None,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> Dict[str, Any]:
        """Set the upload storage location for this entity. This configures where
        files uploaded to this entity will be stored.

        Arguments:
            storage_location_id: The storage location ID(s) to set. Can be a single
                ID, a list of IDs (first is default, max 10), or None to use
                Synapse default storage.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The project setting dict returned from Synapse.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Setting storage location on a folder
            Set storage location on a folder:

                from synapseclient.models import Folder

                import synapseclient
                synapseclient.login()

                folder = Folder(id="syn123").get()
                setting = folder.set_storage_location(storage_location_id=12345)
                print(setting)
        """
        return {}

    def get_project_setting(
        self,
        setting_type: str = "upload",
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> Optional[Dict[str, Any]]:
        """Get the project setting for this entity.

        Arguments:
            setting_type: The type of setting to retrieve. One of:
                'upload', 'external_sync', 'requester_pays'. Default: 'upload'.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The project setting as a dictionary, or None if no setting exists.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Getting project settings
            Get the upload settings for a folder:

                from synapseclient.models import Folder

                import synapseclient
                synapseclient.login()

                folder = Folder(id="syn123").get()
                setting = folder.get_project_setting(setting_type="upload")
                if setting:
                    print(f"Storage locations: {setting['locations']}")
        """
        return {}

    def delete_project_setting(
        self,
        setting_id: str,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> None:
        """Delete a project setting by its setting ID.

        Arguments:
            setting_id: The ID of the project setting to delete.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            None

        Raises:
            ValueError: If `setting_id` is not provided.

        Example: Deleting a project setting
            Delete the upload settings for a folder:

                from synapseclient.models import Folder

                import synapseclient
                synapseclient.login()

                folder = Folder(id="syn123").get()
                setting = folder.get_project_setting(setting_type="upload")
                if setting:
                    folder.delete_project_setting(setting_id=setting['id'])
        """
        return None

    def get_sts_storage_token(
        self,
        permission: str,
        *,
        output_format: str = "json",
        min_remaining_life: Optional[int] = None,
        synapse_client: Optional[Synapse] = None,
    ) -> Any:
        """Get STS (AWS Security Token Service) credentials for direct access to
        the storage location backing this entity. These credentials can be used
        with AWS tools like awscli and boto3.

        Arguments:
            permission: The permission level for the token. Must be 'read_only'
                or 'read_write'.
            output_format: The output format for the credentials. Options:
                'json' (default), 'boto', 'shell', 'bash', 'cmd', 'powershell'.
            min_remaining_life: The minimum remaining life (in seconds) for a
                cached token before a new one is fetched.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The STS credentials in the requested format.

        Raises:
            ValueError: If the entity does not have an id set.

        Example: Using credentials with boto3
            Get STS credentials for an STS-enabled folder and use with boto3:

                import boto3
                from synapseclient.models import Folder

                import synapseclient
                synapseclient.login()

                folder = Folder(id="syn123").get()
                credentials = folder.get_sts_storage_token(
                    permission="read_write",
                    output_format="boto",
                )
                s3_client = boto3.client('s3', **credentials)
        """
        return {}

    def index_files_for_migration(
        self,
        dest_storage_location_id: int,
        db_path: Optional[str] = None,
        *,
        source_storage_location_ids: Optional[List[int]] = None,
        file_version_strategy: str = "new",
        include_table_files: bool = False,
        continue_on_error: bool = False,
        synapse_client: Optional[Synapse] = None,
    ) -> "MigrationResult":
        """Index files in this entity for migration to a new storage location.

        This is the first step in migrating files to a new storage location.
        After indexing, use `migrate_indexed_files` to perform the actual migration.

        Arguments:
            dest_storage_location_id: The destination storage location ID.
            db_path: Path to the SQLite database file for tracking migration state.
                If not provided, a temporary directory will be used. The path
                can be retrieved from the returned MigrationResult.db_path.
            source_storage_location_ids: Optional list of source storage location IDs
                to filter which files to migrate. If None, all files are indexed.
            file_version_strategy: Strategy for handling file versions. Options:
                'new' (default) - migrate by creating a new version,
                'all' - migrate all versions, 'latest' - migrate only the
                latest version, 'skip' - do not migrate file versions.
            include_table_files: Whether to include files attached to tables.
            continue_on_error: Whether to continue indexing if an error occurs.
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            A MigrationResult object containing indexing statistics and the database
            path (accessible via result.db_path).

        Example: Indexing files for migration
            Index files in a project for migration:

                from synapseclient.models import Project

                import synapseclient
                synapseclient.login()

                project = Project(id="syn123").get()
                result = project.index_files_for_migration(
                    dest_storage_location_id=12345,
                )
                print(f"Database path: {result.db_path}")
                print(f"Indexed {result.counts_by_status}")
        """
        return None

    def migrate_indexed_files(
        self,
        db_path: str,
        *,
        create_table_snapshots: bool = True,
        continue_on_error: bool = False,
        force: bool = False,
        synapse_client: Optional[Synapse] = None,
    ) -> Optional["MigrationResult"]:
        """Migrate files that have been indexed with `index_files_for_migration`.

        This is the second step in migrating files to a new storage location.
        Files must first be indexed using `index_files_for_migration`.

        Arguments:
            db_path: Path to the SQLite database file created by
                `index_files_for_migration`. You can get this from the
                MigrationResult.db_path returned by index_files_for_migration.
            create_table_snapshots: Whether to create table snapshots before
                migrating table files.
            continue_on_error: Whether to continue migration if an error occurs.
            force: Skip the interactive confirmation prompt and proceed with
                migration automatically. Set to ``True`` when running
                non-interactively (scripts, CI, automated pipelines).
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            A MigrationResult object containing migration statistics, or None
            if migration was aborted (the user declined the confirmation
            prompt, or the session is non-interactive and force=False).

        Example: Migrating indexed files
            Migrate previously indexed files:

                from synapseclient.models import Project

                import synapseclient
                synapseclient.login()

                project = Project(id="syn123").get()

                # Index first
                index_result = project.index_files_for_migration(
                    dest_storage_location_id=12345,
                )

                # Then migrate using the db_path from index result
                result = project.migrate_indexed_files(
                    db_path=index_result.db_path,
                    force=True,  # Skip interactive confirmation
                )
                print(f"Migrated {result.counts_by_status}")
        """
        return None

Functions

set_storage_location

set_storage_location(storage_location_id: Optional[Union[int, List[int]]] = None, *, synapse_client: Optional[Synapse] = None) -> Dict[str, Any]

Set the upload storage location for this entity. This configures where files uploaded to this entity will be stored.

PARAMETER DESCRIPTION
storage_location_id

The storage location ID(s) to set. Can be a single ID, a list of IDs (first is default, max 10), or None to use Synapse default storage.

TYPE: Optional[Union[int, List[int]]] DEFAULT: None

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Dict[str, Any]

The project setting dict returned from Synapse.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Setting storage location on a folder

Set storage location on a folder:

from synapseclient.models import Folder

import synapseclient
synapseclient.login()

folder = Folder(id="syn123").get()
setting = folder.set_storage_location(storage_location_id=12345)
print(setting)
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def set_storage_location(
    self,
    storage_location_id: Optional[Union[int, List[int]]] = None,
    *,
    synapse_client: Optional[Synapse] = None,
) -> Dict[str, Any]:
    """Set the upload storage location for this entity. This configures where
    files uploaded to this entity will be stored.

    Arguments:
        storage_location_id: The storage location ID(s) to set. Can be a single
            ID, a list of IDs (first is default, max 10), or None to use
            Synapse default storage.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The project setting dict returned from Synapse.

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Setting storage location on a folder
        Set storage location on a folder:

            from synapseclient.models import Folder

            import synapseclient
            synapseclient.login()

            folder = Folder(id="syn123").get()
            setting = folder.set_storage_location(storage_location_id=12345)
            print(setting)
    """
    return {}
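The parameter description above says `storage_location_id` may be a single ID, a list of IDs (first is the default, at most 10), or `None` for Synapse default storage. A minimal client-side sketch of that normalization logic; the helper name is illustrative and not part of the mixin:

```python
from typing import List, Optional, Union

def normalize_storage_location_ids(
    storage_location_id: Optional[Union[int, List[int]]],
) -> Optional[List[int]]:
    """Normalize a storage_location_id argument per the documented rules:
    None means Synapse default storage, a single int becomes a one-element
    list, and at most 10 IDs are allowed (the first is the default)."""
    if storage_location_id is None:
        return None  # fall back to Synapse default storage
    if isinstance(storage_location_id, int):
        return [storage_location_id]
    if len(storage_location_id) > 10:
        raise ValueError("At most 10 storage location IDs are supported")
    return list(storage_location_id)
```

For example, `normalize_storage_location_ids(12345)` yields `[12345]`, matching the single-ID call in the example above.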

get_project_setting

get_project_setting(setting_type: str = 'upload', *, synapse_client: Optional[Synapse] = None) -> Optional[Dict[str, Any]]

Get the project setting for this entity.

PARAMETER DESCRIPTION
setting_type

The type of setting to retrieve. One of: 'upload', 'external_sync', 'requester_pays'. Default: 'upload'.

TYPE: str DEFAULT: 'upload'

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Optional[Dict[str, Any]]

The project setting as a dictionary, or None if no setting exists.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Getting project settings

Get the upload settings for a folder:

from synapseclient.models import Folder

import synapseclient
synapseclient.login()

folder = Folder(id="syn123").get()
setting = folder.get_project_setting(setting_type="upload")
if setting:
    print(f"Storage locations: {setting['locations']}")
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def get_project_setting(
    self,
    setting_type: str = "upload",
    *,
    synapse_client: Optional[Synapse] = None,
) -> Optional[Dict[str, Any]]:
    """Get the project setting for this entity.

    Arguments:
        setting_type: The type of setting to retrieve. One of:
            'upload', 'external_sync', 'requester_pays'. Default: 'upload'.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The project setting as a dictionary, or None if no setting exists.

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Getting project settings
        Get the upload settings for a folder:

            from synapseclient.models import Folder

            import synapseclient
            synapseclient.login()

            folder = Folder(id="syn123").get()
            setting = folder.get_project_setting(setting_type="upload")
            if setting:
                print(f"Storage locations: {setting['locations']}")
    """
    return {}
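Since `get_project_setting` accepts three documented setting types and returns `None` where no setting exists, gathering all of them is a small loop. A hedged sketch; the helper is illustrative, not part of the mixin:

```python
from typing import Any, Dict, Optional

# The three setting types documented for get_project_setting
SETTING_TYPES = ("upload", "external_sync", "requester_pays")

def collect_project_settings(entity) -> Dict[str, Optional[Dict[str, Any]]]:
    """Fetch every documented setting type for an entity. Values are None
    for types where no setting exists, mirroring get_project_setting."""
    return {
        setting_type: entity.get_project_setting(setting_type=setting_type)
        for setting_type in SETTING_TYPES
    }
```

This works against any object exposing the `get_project_setting` signature shown above, such as a `Project` or `Folder`.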

delete_project_setting

delete_project_setting(setting_id: str, *, synapse_client: Optional[Synapse] = None) -> None

Delete a project setting by its setting ID.

PARAMETER DESCRIPTION
setting_id

The ID of the project setting to delete.

TYPE: str

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
None

None

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Deleting a project setting

Delete the upload settings for a folder:

from synapseclient.models import Folder

import synapseclient
synapseclient.login()

folder = Folder(id="syn123").get()
setting = folder.get_project_setting(setting_type="upload")
if setting:
    folder.delete_project_setting(setting_id=setting['id'])
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def delete_project_setting(
    self,
    setting_id: str,
    *,
    synapse_client: Optional[Synapse] = None,
) -> None:
    """Delete a project setting by its setting ID.

    Arguments:
        setting_id: The ID of the project setting to delete.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        None

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Deleting a project setting
        Delete the upload settings for a folder:

            from synapseclient.models import Folder

            import synapseclient
            synapseclient.login()

            folder = Folder(id="syn123").get()
            setting = folder.get_project_setting(setting_type="upload")
            if setting:
                folder.delete_project_setting(setting_id=setting['id'])
    """
    return None

get_sts_storage_token

get_sts_storage_token(permission: str, *, output_format: str = 'json', min_remaining_life: Optional[int] = None, synapse_client: Optional[Synapse] = None) -> Any

Get STS (AWS Security Token Service) credentials for direct access to the storage location backing this entity. These credentials can be used with AWS tools like awscli and boto3.

PARAMETER DESCRIPTION
permission

The permission level for the token. Must be 'read_only' or 'read_write'.

TYPE: str

output_format

The output format for the credentials. Options: 'json' (default), 'boto', 'shell', 'bash', 'cmd', 'powershell'.

TYPE: str DEFAULT: 'json'

min_remaining_life

The minimum remaining life (in seconds) for a cached token before a new one is fetched.

TYPE: Optional[int] DEFAULT: None

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Any

The STS credentials in the requested format.

RAISES DESCRIPTION
ValueError

If the entity does not have an id set.

Using credentials with boto3

Get STS credentials for an STS-enabled folder and use with boto3:

import boto3
from synapseclient.models import Folder

import synapseclient
synapseclient.login()

folder = Folder(id="syn123").get()
credentials = folder.get_sts_storage_token(
    permission="read_write",
    output_format="boto",
)
s3_client = boto3.client('s3', **credentials)
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def get_sts_storage_token(
    self,
    permission: str,
    *,
    output_format: str = "json",
    min_remaining_life: Optional[int] = None,
    synapse_client: Optional[Synapse] = None,
) -> Any:
    """Get STS (AWS Security Token Service) credentials for direct access to
    the storage location backing this entity. These credentials can be used
    with AWS tools like awscli and boto3.

    Arguments:
        permission: The permission level for the token. Must be 'read_only'
            or 'read_write'.
        output_format: The output format for the credentials. Options:
            'json' (default), 'boto', 'shell', 'bash', 'cmd', 'powershell'.
        min_remaining_life: The minimum remaining life (in seconds) for a
            cached token before a new one is fetched.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The STS credentials in the requested format.

    Raises:
        ValueError: If the entity does not have an id set.

    Example: Using credentials with boto3
        Get STS credentials for an STS-enabled folder and use with boto3:

            import boto3
            from synapseclient.models import Folder

            import synapseclient
            synapseclient.login()

            folder = Folder(id="syn123").get()
            credentials = folder.get_sts_storage_token(
                permission="read_write",
                output_format="boto",
            )
            s3_client = boto3.client('s3', **credentials)
    """
    return {}
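The `'shell'`/`'bash'` output formats return credentials ready to evaluate in a terminal, while `'json'` and `'boto'` return a dict. If you have a boto-style dict (lowercase keys as accepted by `boto3.client('s3', **credentials)`) and want shell exports instead, a hedged sketch; treat the exact dict key names as an assumption:

```python
from typing import Dict

def credentials_to_exports(credentials: Dict[str, str]) -> str:
    """Render a boto-style credentials dict as shell export statements.
    Assumes lowercase boto3 keyword keys such as aws_access_key_id."""
    return "\n".join(
        f"export {key.upper()}={value}"
        for key, value in sorted(credentials.items())
    )
```

In practice, requesting `output_format="shell"` directly is simpler; this only illustrates the relationship between the two formats.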

index_files_for_migration

index_files_for_migration(dest_storage_location_id: int, db_path: Optional[str] = None, *, source_storage_location_ids: Optional[List[int]] = None, file_version_strategy: str = 'new', include_table_files: bool = False, continue_on_error: bool = False, synapse_client: Optional[Synapse] = None) -> MigrationResult

Index files in this entity for migration to a new storage location.

This is the first step in migrating files to a new storage location. After indexing, use migrate_indexed_files to perform the actual migration.

PARAMETER DESCRIPTION
dest_storage_location_id

The destination storage location ID.

TYPE: int

db_path

Path to the SQLite database file for tracking migration state. If not provided, a temporary directory will be used. The path can be retrieved from the returned MigrationResult.db_path.

TYPE: Optional[str] DEFAULT: None

source_storage_location_ids

Optional list of source storage location IDs to filter which files to migrate. If None, all files are indexed.

TYPE: Optional[List[int]] DEFAULT: None

file_version_strategy

Strategy for handling file versions. Options: 'new' (default) - create new versions, 'all' - migrate all versions, 'latest' - only migrate latest version, 'skip' - skip if file exists.

TYPE: str DEFAULT: 'new'

include_table_files

Whether to include files attached to tables.

TYPE: bool DEFAULT: False

continue_on_error

Whether to continue indexing if an error occurs.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
MigrationResult

A MigrationResult object containing indexing statistics and the database path (accessible via result.db_path).

Indexing files for migration

Index files in a project for migration:

from synapseclient.models import Project

import synapseclient
synapseclient.login()

project = Project(id="syn123").get()
result = project.index_files_for_migration(
    dest_storage_location_id=12345,
)
print(f"Database path: {result.db_path}")
print(f"Indexed {result.counts_by_status}")
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def index_files_for_migration(
    self,
    dest_storage_location_id: int,
    db_path: Optional[str] = None,
    *,
    source_storage_location_ids: Optional[List[int]] = None,
    file_version_strategy: str = "new",
    include_table_files: bool = False,
    continue_on_error: bool = False,
    synapse_client: Optional[Synapse] = None,
) -> "MigrationResult":
    """Index files in this entity for migration to a new storage location.

    This is the first step in migrating files to a new storage location.
    After indexing, use `migrate_indexed_files` to perform the actual migration.

    Arguments:
        dest_storage_location_id: The destination storage location ID.
        db_path: Path to the SQLite database file for tracking migration state.
            If not provided, a temporary directory will be used. The path
            can be retrieved from the returned MigrationResult.db_path.
        source_storage_location_ids: Optional list of source storage location IDs
            to filter which files to migrate. If None, all files are indexed.
        file_version_strategy: Strategy for handling file versions. Options:
            'new' (default) - create new versions, 'all' - migrate all versions,
            'latest' - only migrate latest version, 'skip' - skip if file exists.
        include_table_files: Whether to include files attached to tables.
        continue_on_error: Whether to continue indexing if an error occurs.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        A MigrationResult object containing indexing statistics and the database
        path (accessible via result.db_path).

    Example: Indexing files for migration
        Index files in a project for migration:

            from synapseclient.models import Project

            import synapseclient
            synapseclient.login()

            project = Project(id="syn123").get()
            result = project.index_files_for_migration(
                dest_storage_location_id=12345,
            )
            print(f"Database path: {result.db_path}")
            print(f"Indexed {result.counts_by_status}")
    """
    return None

migrate_indexed_files

migrate_indexed_files(db_path: str, *, create_table_snapshots: bool = True, continue_on_error: bool = False, force: bool = False, synapse_client: Optional[Synapse] = None) -> Optional[MigrationResult]

Migrate files that have been indexed with index_files_for_migration.

This is the second step in migrating files to a new storage location. Files must first be indexed using index_files_for_migration.

PARAMETER DESCRIPTION
db_path

Path to the SQLite database file created by index_files_for_migration. You can get this from the MigrationResult.db_path returned by index_files_for_migration.

TYPE: str

create_table_snapshots

Whether to create table snapshots before migrating table files.

TYPE: bool DEFAULT: True

continue_on_error

Whether to continue migration if an error occurs.

TYPE: bool DEFAULT: False

force

Whether to force migration of files that have already been migrated. Also bypasses interactive confirmation.

TYPE: bool DEFAULT: False

synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
Optional[MigrationResult]

A MigrationResult object containing migration statistics, or None if the user declined the confirmation prompt.

Migrating indexed files

Migrate previously indexed files:

from synapseclient.models import Project

import synapseclient
synapseclient.login()

project = Project(id="syn123").get()

# Index first
index_result = project.index_files_for_migration(
    dest_storage_location_id=12345,
)

# Then migrate using the db_path from index result
result = project.migrate_indexed_files(
    db_path=index_result.db_path,
    force=True,  # Skip interactive confirmation
)
print(f"Migrated {result.counts_by_status}")
Source code in synapseclient/models/protocols/storage_location_mixin_protocol.py
def migrate_indexed_files(
    self,
    db_path: str,
    *,
    create_table_snapshots: bool = True,
    continue_on_error: bool = False,
    force: bool = False,
    synapse_client: Optional[Synapse] = None,
) -> Optional["MigrationResult"]:
    """Migrate files that have been indexed with `index_files_for_migration`.

    This is the second step in migrating files to a new storage location.
    Files must first be indexed using `index_files_for_migration`.

    Arguments:
        db_path: Path to the SQLite database file created by
            `index_files_for_migration`. You can get this from the
            MigrationResult.db_path returned by index_files_for_migration.
        create_table_snapshots: Whether to create table snapshots before
            migrating table files.
        continue_on_error: Whether to continue migration if an error occurs.
        force: Whether to force migration of files that have already been
            migrated. Also bypasses interactive confirmation.
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        A MigrationResult object containing migration statistics, or None
        if the user declined the confirmation prompt.

    Example: Migrating indexed files
        Migrate previously indexed files:

            from synapseclient.models import Project

            import synapseclient
            synapseclient.login()

            project = Project(id="syn123").get()

            # Index first
            index_result = project.index_files_for_migration(
                dest_storage_location_id=12345,
            )

            # Then migrate using the db_path from index result
            result = project.migrate_indexed_files(
                db_path=index_result.db_path,
                force=True,  # Skip interactive confirmation
            )
            print(f"Migrated {result.counts_by_status}")
    """
    return None
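Both `index_files_for_migration` and `migrate_indexed_files` expose `result.counts_by_status`. A hedged sketch of summarizing those counts to decide whether a re-run is needed; the status key names used here (e.g. `'MIGRATED'`, `'ERRORED'`) are assumptions about the mapping's contents, not documented values:

```python
from typing import Dict, Tuple

def summarize_migration(counts_by_status: Dict[str, int]) -> Tuple[int, int]:
    """Return (total files counted, files in the assumed 'ERRORED' state)
    from a MigrationResult.counts_by_status mapping."""
    total = sum(counts_by_status.values())
    errored = counts_by_status.get("ERRORED", 0)
    return total, errored
```

If any files errored, `migrate_indexed_files` can be called again with the same `db_path`; per the `force` parameter above, already-migrated files are not re-migrated unless `force=True`.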