Repositories

Organizations

class Organizations(client_api: dtlpy.services.api_client.ApiClient)[source]

Bases: object

Organizations repository

add_member(email, role: dtlpy.entities.organization.MemberOrgRole = MemberOrgRole.MEMBER, organization_id: Optional[str] = None, organization_name: Optional[str] = None, organization: Optional[dtlpy.entities.organization.Organization] = None)[source]

Add a member to the organization :param email: member email :param role: MemberOrgRole.ADMIN, MemberOrgRole.OWNER or MemberOrgRole.MEMBER :param organization_id: :param organization_name: :param organization: :return: True
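As a usage sketch (the organization name is a placeholder, and an active `dl.login()` session with dtlpy installed is assumed):

```python
def add_org_admin(email, organization_name):
    """Add a member to an organization as an admin.

    Assumes dtlpy is installed and dl.login() has already been called;
    the organization name is a placeholder.
    """
    if "@" not in email:
        raise ValueError("expected an email address")
    import dtlpy as dl
    return dl.organizations.add_member(
        email=email,
        role=dl.MemberOrgRole.ADMIN,
        organization_name=organization_name,
    )
```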

create(organization_json) dtlpy.entities.organization.Organization[source]

Create a new organization :param organization_json: json containing the organization fields :return: Organization object

delete_member(user_id: str, organization_id: Optional[str] = None, organization_name: Optional[str] = None, organization: Optional[dtlpy.entities.organization.Organization] = None, sure: bool = False, really: bool = False) bool[source]

Delete a member from the organization :param user_id: :param organization_id: :param organization_name: :param organization: :param sure: are you sure you want to delete? :param really: really really? :return: True

get(organization_id: Optional[str] = None, organization_name: Optional[str] = None, fetch: Optional[bool] = None) dtlpy.entities.organization.Organization[source]

Get an Organization object :param organization_id: optional - search by id :param organization_name: optional - search by name :param fetch: optional - fetch entity from platform, default taken from cookie :return: Organization object

list() dtlpy.miscellaneous.list_print.List[dtlpy.entities.organization.Organization][source]

Get the organizations list. :return: List of Organization objects

list_groups(organization: Optional[dtlpy.entities.organization.Organization] = None, organization_id: Optional[str] = None, organization_name: Optional[str] = None)[source]

List all organization groups :param organization: :param organization_id: :param organization_name: :return: groups list

list_integrations(organization: Optional[dtlpy.entities.organization.Organization] = None, organization_id: Optional[str] = None, organization_name: Optional[str] = None, only_available=False)[source]

List all organization integrations :param organization: :param organization_id: :param organization_name: :param only_available: bool - if True, list only the available integrations :return: integrations list

list_members(organization: Optional[dtlpy.entities.organization.Organization] = None, organization_id: Optional[str] = None, organization_name: Optional[str] = None, role: Optional[dtlpy.entities.organization.MemberOrgRole] = None)[source]

List all organization members :param organization: :param organization_id: :param organization_name: :param role: MemberOrgRole.ADMIN, MemberOrgRole.OWNER or MemberOrgRole.MEMBER :return: members list

update(plan: str, organization: Optional[dtlpy.entities.organization.Organization] = None, organization_id: Optional[str] = None, organization_name: Optional[str] = None) dtlpy.entities.organization.Organization[source]

Update an organization :param plan: OrganizationsPlans.FREEMIUM, OrganizationsPlans.PREMIUM :param organization: :param organization_id: :param organization_name: :return: Organization object

update_member(email: str, role: dtlpy.entities.organization.MemberOrgRole = MemberOrgRole.MEMBER, organization_id: Optional[str] = None, organization_name: Optional[str] = None, organization: Optional[dtlpy.entities.organization.Organization] = None)[source]

Update a member's role :param email: :param role: MemberOrgRole.ADMIN, MemberOrgRole.OWNER or MemberOrgRole.MEMBER :param organization_id: :param organization_name: :param organization:

Integrations

Integrations Repository

class Integrations(client_api: dtlpy.services.api_client.ApiClient, org: Optional[dtlpy.entities.organization.Organization] = None, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Integrations repository

create(integrations_type: dtlpy.entities.driver.ExternalStorage, name, options)[source]

Add an integration to the organization :param integrations_type: dl.ExternalStorage :param name: integration name :param options: depends on the storage type: s3 - {key: “”, secret: “”}; gcs - {key: “”, secret: “”, content: “”}; azureblob - {key: “”, secret: “”, clientId: “”, tenantId: “”}; key_value - {key: “”, value: “”}

Returns

True
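The expected shape of the options payload per storage type, as listed in the create() docstring above, can be summarized as (all values are placeholders):

```python
# `options` payload shapes per storage type; every value below is a
# placeholder to be filled with the real credentials.
S3_OPTIONS = {"key": "", "secret": ""}
GCS_OPTIONS = {"key": "", "secret": "", "content": ""}
AZUREBLOB_OPTIONS = {"key": "", "secret": "", "clientId": "", "tenantId": ""}
KEY_VALUE_OPTIONS = {"key": "", "value": ""}
```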

delete(integrations_id: str, sure: bool = False, really: bool = False) bool[source]

Delete an integration from the organization :param integrations_id: :param sure: are you sure you want to delete? :param really: really really? :return: True

get(integrations_id: str)[source]

Get an organization integration :param integrations_id: :return: the integration

list(only_available=False)[source]

List all organization integrations :param only_available: bool - if True, list only the available integrations :return: integrations list

update(new_name: str, integrations_id)[source]

Update an integration's name :param new_name: :param integrations_id:

Projects

class Projects(client_api: dtlpy.services.api_client.ApiClient, org=None)[source]

Bases: object

Projects repository

add_member(email: str, project_id: str, role: dtlpy.entities.project.MemberRole = MemberRole.DEVELOPER)[source]
Parameters
  • email

  • project_id

  • role – “owner”, “engineer”, “annotator”, “annotationManager”

checkout(identifier: Optional[str] = None, project_name: Optional[str] = None, project_id: Optional[str] = None, project: Optional[dtlpy.entities.project.Project] = None)[source]

Check-out a project :param identifier: project name or partial id :param project_name: :param project_id: :param project: :return:

create(project_name: str, checkout: bool = False) dtlpy.entities.project.Project[source]

Create a new project :param project_name: :param checkout: :return: Project object
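A minimal sketch of creating a project and checking it out (the project name is a placeholder; assumes dtlpy is installed and `dl.login()` has been called):

```python
def create_and_checkout_project(project_name):
    """Create a project and set it as the checked-out default.

    Requires an active dtlpy session; the name is a placeholder.
    """
    if not project_name or not project_name.strip():
        raise ValueError("project_name must be a non-empty string")
    import dtlpy as dl
    return dl.projects.create(project_name=project_name, checkout=True)
```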

delete(project_name: Optional[str] = None, project_id: Optional[str] = None, sure: bool = False, really: bool = False) bool[source]

Delete a project forever! :param project_name: optional - search by name :param project_id: optional - search by id :param sure: are you sure you want to delete? :param really: really really?

Returns

True

get(project_name: Optional[str] = None, project_id: Optional[str] = None, checkout: bool = False, fetch: Optional[bool] = None) dtlpy.entities.project.Project[source]

Get a Project object :param project_name: optional - search by name :param project_id: optional - search by id :param checkout: :param fetch: optional - fetch entity from platform, default taken from cookie :return: Project object

list() dtlpy.miscellaneous.list_print.List[dtlpy.entities.project.Project][source]

Get the user's projects list. :return: List of Project objects

list_members(project: dtlpy.entities.project.Project, role: Optional[dtlpy.entities.project.MemberRole] = None)[source]
Parameters
  • project

  • role – “owner”, “engineer”, “annotator”, “annotationManager”

open_in_web(project_name: Optional[str] = None, project_id: Optional[str] = None, project: Optional[dtlpy.entities.project.Project] = None)[source]
Parameters
  • project_name

  • project_id

  • project

remove_member(email: str, project_id: str)[source]
Parameters
  • email

  • project_id

update(project: dtlpy.entities.project.Project, system_metadata: bool = False) dtlpy.entities.project.Project[source]

Update a project :param project: :param system_metadata: True, if you want to change the system metadata :return: Project object

update_member(email: str, project_id: str, role: dtlpy.entities.project.MemberRole = MemberRole.DEVELOPER)[source]
Parameters
  • email

  • project_id

  • role – “owner”, “engineer”, “annotator”, “annotationManager”

Datasets

Datasets Repository

class Datasets(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Datasets repository

checkout(identifier=None, dataset_name=None, dataset_id=None, dataset=None)[source]

Check-out a dataset :param identifier: dataset name or partial id :param dataset_name: :param dataset_id: :param dataset: :return:

clone(dataset_id, clone_name, filters=None, with_items_annotations=True, with_metadata=True, with_task_annotations_status=True)[source]

Clone a dataset

Parameters
  • dataset_id – id of the dataset to clone

  • clone_name – new dataset name

  • filters – Filters entity or a query dict

  • with_items_annotations

  • with_metadata

  • with_task_annotations_status

Returns

create(dataset_name, labels=None, attributes=None, ontology_ids=None, driver=None, driver_id=None, checkout=False, expiration_options: Optional[dtlpy.entities.dataset.ExpirationOptions] = None) dtlpy.entities.dataset.Dataset[source]

Create a new dataset

Parameters
  • dataset_name – name

  • labels – dictionary of {tag: color} or list of label entities

  • attributes – dataset’s ontology’s attributes

  • ontology_ids – optional - dataset ontology

  • driver – optional - storage driver Driver object or driver name

  • driver_id – optional - driver id

  • checkout – bool. cache the dataset to work locally

  • expiration_options – dl.ExpirationOptions object that contain definitions for dataset like MaxItemDays

Returns

Dataset object
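For instance, a sketch of creating a dataset with an initial label map (names and colors are placeholders; `project` is a dl.Project entity):

```python
def create_labeled_dataset(project, dataset_name, labels):
    """Create a dataset with an initial {tag: color} label map.

    `project` is a dl.Project entity; the dataset name and labels
    below are placeholders.
    """
    if not isinstance(labels, dict):
        raise TypeError("labels should be a {tag: color} dict here")
    return project.datasets.create(dataset_name=dataset_name, labels=labels)

# e.g. create_labeled_dataset(project, "my-dataset",
#                             {"person": "#0000ff", "car": "#ff0000"})
```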

delete(dataset_name=None, dataset_id=None, sure=False, really=False)[source]

Delete a dataset forever! :param dataset_name: optional - search by name :param dataset_id: optional - search by id :param sure: are you sure you want to delete? :param really: really really? :return: True

directory_tree(dataset: Optional[dtlpy.entities.dataset.Dataset] = None, dataset_name=None, dataset_id=None)[source]

Get dataset’s directory tree :param dataset: :param dataset_name: :param dataset_id: :return:

static download_annotations(dataset, local_path=None, filters: Optional[dtlpy.entities.filters.Filters] = None, annotation_options: Optional[dtlpy.entities.annotation.ViewAnnotationOptions] = None, annotation_filters: Optional[dtlpy.entities.filters.Filters] = None, overwrite=False, thickness=1, with_text=False, remote_path=None, include_annotations_in_output=True, export_png_files=False, filter_output_annotations=False, alpha=None)[source]

Download a dataset's annotations by filters. Filters the dataset both for items and for annotations, and downloads the matching annotations. Optionally also downloads the annotations as mask, instance, or image mask of the item.

Parameters
  • dataset – dataset to download from

  • local_path – local folder or filename to save to.

  • filters – Filters entity or a dictionary containing filters parameters

  • annotation_options – download annotations options: list(dl.ViewAnnotationOptions)

  • annotation_filters – Filters entity to filter annotations for download

  • overwrite – optional - default = False

  • thickness – optional - line thickness, if -1 annotation will be filled, default =1

  • with_text – optional - add text to annotations, default = False

  • remote_path – DEPRECATED and ignored

  • include_annotations_in_output – default = True; whether the export should contain annotations

  • export_png_files – default = False; whether semantic annotations should also be exported as png files

  • filter_output_annotations – default - False, given an export by filter - determine if to filter out annotations

  • alpha – opacity value [0 1], default 1

Returns

List of local_path per each downloaded item
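A sketch of an annotation export (the path is a placeholder; `dataset` is a dl.Dataset entity and dtlpy is assumed installed):

```python
def export_annotations(dataset, local_path):
    """Download a dataset's annotations both as JSON and as mask images.

    `dataset` is a dl.Dataset entity; the target path is a placeholder.
    """
    if not local_path:
        raise ValueError("local_path is required")
    import dtlpy as dl
    return dataset.download_annotations(
        local_path=local_path,
        annotation_options=[
            dl.ViewAnnotationOptions.JSON,
            dl.ViewAnnotationOptions.MASK,
        ],
        overwrite=True,
    )
```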

get(dataset_name=None, dataset_id=None, checkout=False, fetch=None) dtlpy.entities.dataset.Dataset[source]

Get dataset by name or id

Parameters
  • dataset_name – optional - search by name

  • dataset_id – optional - search by id

  • checkout

  • fetch – optional - fetch entity from platform, default taken from cookie

Returns

Dataset object

list(name=None, creator=None) dtlpy.miscellaneous.list_print.List[dtlpy.entities.dataset.Dataset][source]

List all datasets. :param name: :param creator: :return: List of datasets

merge(merge_name, dataset_ids, project_ids, with_items_annotations=True, with_metadata=True, with_task_annotations_status=True, wait=True)[source]

Merge datasets

Parameters
  • merge_name – the name of the merged dataset

  • dataset_ids – list of dataset ids to merge

  • project_ids – the id(s) of the project(s) that contain the datasets

  • with_items_annotations

  • with_metadata

  • with_task_annotations_status

  • wait – wait for the command to finish

Returns

open_in_web(dataset_name=None, dataset_id=None, dataset=None)[source]
Parameters
  • dataset_name

  • dataset_id

  • dataset

set_readonly(state: bool, dataset: dtlpy.entities.dataset.Dataset)[source]

Set dataset readonly mode :param state: :param dataset: :return:

sync(dataset_id, wait=True)[source]

Sync dataset with external storage

Parameters
  • dataset_id – id of the dataset to sync

  • wait – wait for the command to finish

Returns

update(dataset: dtlpy.entities.dataset.Dataset, system_metadata=False, patch: Optional[dict] = None) dtlpy.entities.dataset.Dataset[source]

Update dataset fields :param dataset: Dataset entity :param system_metadata: bool - True, if you want to change the system metadata :param patch: specific patch request :return: Dataset object

upload_annotations(dataset, local_path, filters: Optional[dtlpy.entities.filters.Filters] = None, clean=False, remote_root_path='/')[source]

Upload annotations to a dataset. :param dataset: dataset to upload to :param local_path: str - local folder where the annotation files are. :param filters: Filters entity or a dictionary containing filters parameters :param clean: bool - if True, remove the old annotations :param remote_root_path: str - the remote root path to match remote and local items. For example, if the item filepath is a/b/item and remote_root_path is /a, the start folder will be b instead of a :return:

Drivers

class Drivers(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Drivers repository

create(name, driver_type, integration_id, bucket_name, project_id=None, allow_external_delete=True, region=None, storage_class='', path='')[source]
Parameters
  • name – the driver name

  • driver_type – ExternalStorage.S3, ExternalStorage.GCS , ExternalStorage.AZUREBLOB

  • integration_id – the integration id

  • bucket_name – the external bucket name

  • project_id

  • allow_external_delete

  • region – relevant only for S3 - the bucket region

  • storage_class – relevant only for S3

  • path – optional. By default, path is the root folder. Path is case sensitive

Returns

driver object
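A sketch of the full flow: create an S3 driver over an existing integration, then a dataset backed by it (the driver name, dataset name, and region are placeholders; `project` is a dl.Project entity):

```python
def connect_s3_bucket(project, integration_id, bucket_name):
    """Create an S3 driver and a dataset backed by it.

    All names and the region are placeholders; `project` is a
    dl.Project entity and `integration_id` an existing integration id.
    """
    if not bucket_name:
        raise ValueError("bucket_name is required")
    import dtlpy as dl
    driver = project.drivers.create(
        name="my-s3-driver",
        driver_type=dl.ExternalStorage.S3,
        integration_id=integration_id,
        bucket_name=bucket_name,
        region="eu-west-1",
    )
    # a dataset on an external driver serves its items from the bucket
    return project.datasets.create(dataset_name="s3-dataset", driver=driver)
```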

get(driver_name: Optional[str] = None, driver_id: Optional[str] = None) dtlpy.entities.driver.Driver[source]

Get a Driver object :param driver_name: optional - search by name :param driver_id: optional - search by id :return: Driver object

list() dtlpy.miscellaneous.list_print.List[dtlpy.entities.driver.Driver][source]

Get project’s drivers list. :return: List of Drivers objects

Items

class Items(client_api: dtlpy.services.api_client.ApiClient, datasets: Optional[dtlpy.repositories.datasets.Datasets] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, dataset_id=None, items_entity=None)[source]

Bases: object

Items repository

clone(item_id, dst_dataset_id, remote_filepath=None, metadata=None, with_annotations=True, with_metadata=True, with_task_annotations_status=False, allow_many=False, wait=True)[source]

Clone an item :param item_id: item to clone :param dst_dataset_id: destination dataset id :param remote_filepath: complete filepath :param metadata: new metadata to add :param with_annotations: clone annotations :param with_metadata: clone metadata :param with_task_annotations_status: clone task annotations status :param allow_many: bool - if True, multiple clones in a single dataset are allowed (default=False) :param wait: wait for the command to finish :return: Item

delete(filename=None, item_id=None, filters: Optional[dtlpy.entities.filters.Filters] = None)[source]

Delete item from platform

Parameters
  • filename – optional - search item by remote path

  • item_id – optional - search item by id

  • filters – optional - delete items by filter

Returns

True

download(filters: Optional[dtlpy.entities.filters.Filters] = None, items=None, local_path=None, file_types=None, save_locally=True, to_array=False, annotation_options: Optional[dtlpy.entities.annotation.ViewAnnotationOptions] = None, annotation_filters: Optional[dtlpy.entities.filters.Filters] = None, overwrite=False, to_items_folder=True, thickness=1, with_text=False, without_relative_path=None, avoid_unnecessary_annotation_download=False, include_annotations_in_output=True, export_png_files=False, filter_output_annotations=False, alpha=None)[source]

Download a dataset by filters. Filters the dataset for items and saves them locally. Optionally also downloads the annotation, mask, instance, and image mask of each item.

Parameters
  • filters – Filters entity or a dictionary containing filters parameters

  • items – download Item entity or item_id (or a list of item)

  • local_path – local folder or filename to save to.

  • file_types – a list of file types to download, e.g. [‘video/webm’, ‘video/mp4’, ‘image/jpeg’, ‘image/png’]

  • save_locally – bool. save to disk or return a buffer

  • to_array – returns Ndarray when True and local_path = False

  • annotation_options – download annotations options: list(dl.ViewAnnotationOptions)

  • annotation_filters – Filters entity to filter annotations for download

  • overwrite – optional - default = False

  • to_items_folder – Create ‘items’ folder and download items to it

  • thickness – optional - line thickness, if -1 annotation will be filled, default =1

  • with_text – optional - add text to annotations, default = False

  • without_relative_path – bool - download items without the relative path from platform

  • avoid_unnecessary_annotation_download – default - False

  • include_annotations_in_output – default = True; whether the export should contain annotations

  • export_png_files – default = False; whether semantic annotations should also be exported as png files

  • filter_output_annotations – default - False, given an export by filter - determine if to filter out annotations

  • alpha – opacity value [0 1], default 1

Returns

List of local_path per each downloaded item
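A filtered-download sketch (the directory, mimetype filter field, and target path are assumptions for illustration; `dataset` is a dl.Dataset entity):

```python
def download_train_jpegs(dataset, local_path):
    """Download only the jpeg items under /train, with JSON annotations.

    `dataset` is a dl.Dataset entity; the directory, the filter field
    names, and the target path are placeholders.
    """
    if not local_path:
        raise ValueError("local_path is required")
    import dtlpy as dl
    filters = dl.Filters()
    filters.add(field="dir", values="/train")
    filters.add(field="metadata.system.mimetype", values="image/jpeg")
    return dataset.items.download(
        filters=filters,
        local_path=local_path,
        annotation_options=[dl.ViewAnnotationOptions.JSON],
    )
```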

get(filepath=None, item_id=None, fetch=None, is_dir=False) dtlpy.entities.item.Item[source]

Get Item object

Parameters
  • filepath – optional - search by remote path

  • item_id – optional - search by id

  • fetch – optional - fetch entity from platform, default taken from cookie

  • is_dir – True if you want to get an item from dir type

Returns

Item object

get_all_items()[source]

Get all items in the dataset :return: list of all items

list(filters: Optional[dtlpy.entities.filters.Filters] = None, page_offset=None, page_size=None) dtlpy.entities.paged_entities.PagedEntities[source]

List items

Parameters
  • filters – Filters entity or a dictionary containing filters parameters

  • page_offset – start page

  • page_size – page size

Returns

Pages object

make_dir(directory, dataset: Optional[dtlpy.entities.dataset.Dataset] = None) dtlpy.entities.item.Item[source]

Create a directory in a dataset

Parameters
  • directory – name of directory

  • dataset – optional

Returns

move_items(destination, filters: Optional[dtlpy.entities.filters.Filters] = None, items=None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None) bool[source]

Move items to another directory.

If the directory does not exist, it will be created

Parameters
  • destination – destination directory

  • filters – optional - either this or items. Query of items to move

  • items – optional - either this or filters. A list of items to move

  • dataset – optional

Returns

True if success

open_in_web(filepath=None, item_id=None, item=None)[source]
Parameters
  • filepath – item file path

  • item_id – item id

  • item – item entity

update(item: Optional[dtlpy.entities.item.Item] = None, filters: Optional[dtlpy.entities.filters.Filters] = None, update_values=None, system_update_values=None, system_metadata=False)[source]

Update items metadata :param item: Item object :param filters: optional - update filtered items by the given filter :param update_values: optional - fields to be updated and their new values :param system_update_values: values in system metadata to be updated :param system_metadata: bool - True, if you want to change the system metadata :return: Item object

update_status(status: dtlpy.entities.item.ItemStatus, items=None, item_ids=None, filters=None, dataset=None, clear=False)[source]
Parameters
  • status – ItemStatus.COMPLETED, ItemStatus.APPROVED, ItemStatus.DISCARDED

  • items

  • item_ids

  • filters

  • dataset

  • clear

upload(local_path, local_annotations_path=None, remote_path='/', remote_name=None, file_types=None, overwrite=False, item_metadata=None, output_entity=<class 'dtlpy.entities.item.Item'>, no_output=False)[source]

Upload local files to a dataset. The local filesystem remains untouched. If local_path ends with “*” (e.g. “/images/*”), items are uploaded without the head directory

Parameters
  • local_path – list of local file, local folder, BufferIO, numpy.ndarray or url to upload

  • local_annotations_path – path to Dataloop-format annotation json files.

  • remote_path – remote path to save.

  • remote_name – remote base name to save. When uploading a numpy.ndarray as local_path, a remote_name with a .jpg or .png extension is mandatory

  • file_types – list of file types to upload, e.g. [‘.jpg’, ‘.png’]. default is all

  • item_metadata

  • overwrite – optional - default = False

  • output_entity – output type

  • no_output – do not return the items after upload

Returns

Output (list/single item)
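The head-directory note above can be sketched as a small helper (paths are placeholders; `dataset` is a dl.Dataset entity):

```python
def head_stripped(local_path):
    """Return the '/*' pattern that uploads a folder's contents
    without its head directory (see the note above)."""
    return local_path.rstrip("/") + "/*"

def upload_folder(dataset, local_path, remote_path="/incoming"):
    """Upload everything under a local folder; paths are placeholders."""
    return dataset.items.upload(
        local_path=head_stripped(local_path),
        remote_path=remote_path,
    )
```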

Annotations

class Annotations(client_api: dtlpy.services.api_client.ApiClient, item=None, dataset=None, dataset_id=None)[source]

Bases: object

Annotations repository

delete(annotation=None, annotation_id=None, filters: Optional[dtlpy.entities.filters.Filters] = None)[source]

Remove an annotation from item

Parameters
  • annotation – Annotation object

  • annotation_id – annotation id

  • filters – Filters entity or a dictionary containing filters parameters

Returns

True/False

download(filepath, annotation_format: dtlpy.entities.annotation.ViewAnnotationOptions = ViewAnnotationOptions.MASK, img_filepath=None, height=None, width=None, thickness=1, with_text=False, alpha=None)[source]

Save annotation format to file

Parameters
  • filepath – Target download directory

  • annotation_format – optional - list(dl.ViewAnnotationOptions)

  • img_filepath – img file path - needed for img_mask

  • height – optional - image height

  • width – optional - image width

  • thickness – optional - line thickness, default = 1

  • with_text – optional - draw annotation with text, default = False

  • alpha – opacity value [0 1], default 1

Returns

get(annotation_id)[source]

Get a single annotation

Parameters

annotation_id

Returns

Annotation object or None

list(filters: Optional[dtlpy.entities.filters.Filters] = None, page_offset=None, page_size=None)[source]

List annotations

Parameters
  • filters – Filters entity or a dictionary containing filters parameters

  • page_offset – starting page

  • page_size – size of page

Returns

Pages object

show(image=None, thickness=1, with_text=False, height=None, width=None, annotation_format: dtlpy.entities.annotation.ViewAnnotationOptions = ViewAnnotationOptions.MASK, alpha=None)[source]

Show annotations

Parameters
  • image – empty or image to draw on

  • thickness – line thickness

  • with_text – add label to annotation

  • height – height

  • width – width

  • annotation_format – options: list(dl.ViewAnnotationOptions)

  • alpha – opacity value [0 1], default 1

Returns

ndarray of the annotations

update(annotations, system_metadata=False)[source]

Update an existing annotation.

Parameters
  • annotations – annotations object

  • system_metadata – bool - True, if you want to change the system metadata

Returns

True

update_status(annotation: Optional[dtlpy.entities.annotation.Annotation] = None, annotation_id=None, status: dtlpy.entities.annotation.AnnotationStatus = AnnotationStatus.ISSUE)[source]

Set status on annotation

Parameters
  • annotation – optional - Annotation entity

  • annotation_id – optional - annotation id to set status

  • status – can be AnnotationStatus.ISSUE, AnnotationStatus.APPROVED, AnnotationStatus.REVIEW, AnnotationStatus.CLEAR

Returns

Annotation object

upload(annotations)[source]

Create a new annotation

Parameters

annotations – list or single annotation of type Annotation

Returns

list of annotation objects
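One common way to build the annotations to upload is the item-level builder; a sketch, assuming `item` is a dl.Item entity and pixel coordinates:

```python
def add_box(item, label, top, left, bottom, right):
    """Upload a single box annotation to an item.

    `item` is a dl.Item entity; coordinates are in pixels and the
    label is a placeholder.
    """
    if bottom <= top or right <= left:
        raise ValueError("box must have positive height and width")
    import dtlpy as dl
    builder = item.annotations.builder()
    builder.add(annotation_definition=dl.Box(
        top=top, left=left, bottom=bottom, right=right, label=label))
    return item.annotations.upload(builder)
```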

Recipes

class Recipes(client_api: dtlpy.services.api_client.ApiClient, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, project: Optional[dtlpy.entities.project.Project] = None, project_id: Optional[str] = None)[source]

Bases: object

Recipes repository

clone(recipe: Optional[dtlpy.entities.recipe.Recipe] = None, recipe_id=None, shallow=False)[source]

Clone Recipe

Parameters
  • recipe – Recipe object

  • recipe_id – Recipe id

  • shallow – if True, link to the existing ontologies; otherwise clone all ontologies that are linked to the recipe as well

Returns

Cloned recipe object

create(project_ids=None, ontology_ids=None, labels=None, recipe_name=None, attributes=None) dtlpy.entities.recipe.Recipe[source]

Create New Recipe

If ontology_ids is None, an ontology will be created first :param project_ids: :param ontology_ids: :param labels: :param recipe_name: :param attributes:

delete(recipe_id)[source]

Delete recipe from platform

Parameters

recipe_id – recipe id

Returns

True

get(recipe_id) dtlpy.entities.recipe.Recipe[source]

Get Recipe object

Parameters

recipe_id – recipe id

Returns

Recipe object

list(filters: Optional[dtlpy.entities.filters.Filters] = None) dtlpy.miscellaneous.list_print.List[dtlpy.entities.recipe.Recipe][source]

List recipes for dataset :param filters:

open_in_web(recipe=None, recipe_id=None)[source]
Parameters
  • recipe

  • recipe_id

update(recipe: dtlpy.entities.recipe.Recipe, system_metadata=False) dtlpy.entities.recipe.Recipe[source]

Update a recipe

Parameters
  • recipe – Recipe object

  • system_metadata – bool - True, if you want to change the system metadata

Returns

Recipe object

Ontologies

class Ontologies(client_api: dtlpy.services.api_client.ApiClient, recipe: Optional[dtlpy.entities.recipe.Recipe] = None, project: Optional[dtlpy.entities.project.Project] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None)[source]

Bases: object

Ontologies repository

create(labels, title=None, project_ids=None, attributes=None) dtlpy.entities.ontology.Ontology[source]

Create a new ontology

Parameters
  • labels – recipe tags

  • title – ontology title, name

  • project_ids – recipe project/s

  • attributes – recipe attributes

Returns

Ontology object

delete(ontology_id)[source]

Delete Ontology from platform

Parameters

ontology_id – ontology id

Returns

True

get(ontology_id) dtlpy.entities.ontology.Ontology[source]

Get Ontology object

Parameters

ontology_id – ontology id

Returns

Ontology object

static labels_to_roots(labels)[source]

Converts a labels dict to a list of platform representations of labels

Parameters

labels – labels dict

Returns

platform representation of labels

list(project_ids=None) dtlpy.miscellaneous.list_print.List[dtlpy.entities.ontology.Ontology][source]

List ontologies for recipe :param project_ids: :return:

update(ontology: dtlpy.entities.ontology.Ontology, system_metadata=False) dtlpy.entities.ontology.Ontology[source]

Update Ontology metadata

Parameters
  • ontology – Ontology object

  • system_metadata – bool - True, if you want to change the system metadata

Returns

Ontology object

Tasks

class Tasks(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, project_id: Optional[str] = None)[source]

Bases: object

Tasks repository

add_items(task: Optional[dtlpy.entities.task.Task] = None, task_id=None, filters: Optional[dtlpy.entities.filters.Filters] = None, items=None, assignee_ids=None, query=None, workload=None, limit=None, wait=True) dtlpy.entities.task.Task[source]

Add items to Task

:param task: :param task_id: :param filters: :param items: :param assignee_ids: :param query: :param workload: :param limit: :param wait: wait for the command to finish :return:

create(task_name, due_date=None, assignee_ids=None, workload=None, dataset=None, task_owner=None, task_type='annotation', task_parent_id=None, project_id=None, recipe_id=None, assignments_ids=None, metadata=None, filters=None, items=None, query=None, available_actions=None, wait=True, check_if_exist: dtlpy.entities.filters.Filters = False) dtlpy.entities.task.Task[source]

Create a new Annotation Task

Parameters
  • task_name

  • due_date

  • assignee_ids

  • workload

  • dataset

  • task_owner

  • task_type – “annotation” or “qa”

  • task_parent_id – optional if type is qa - parent task id

  • project_id

  • recipe_id

  • assignments_ids

  • metadata

  • filters

  • items

  • query

  • available_actions

  • wait – wait for the command to finish

  • check_if_exist – dl.Filters - check if a task exists according to the filter

Returns

Annotation Task object
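A sketch of creating an annotation task over a whole dataset (emails and the due date are placeholders; `dataset` is a dl.Dataset entity):

```python
def create_annotation_task(dataset, task_name, assignee_emails, due_date):
    """Create an annotation task over a whole dataset, split between assignees.

    `dataset` is a dl.Dataset entity; the task name, emails, and due
    date (a timestamp) are placeholders.
    """
    if not assignee_emails:
        raise ValueError("at least one assignee is required")
    return dataset.tasks.create(
        task_name=task_name,
        assignee_ids=assignee_emails,
        due_date=due_date,
    )
```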

create_qa_task(task, assignee_ids, due_date=None, filters=None, items=None, query=None) dtlpy.entities.task.Task[source]
Parameters
  • task

  • assignee_ids

  • due_date

  • filters

  • items

  • query

delete(task: Optional[dtlpy.entities.task.Task] = None, task_name=None, task_id=None, wait=True)[source]

Delete an Annotation Task :param task: :param task_name: :param task_id: :param wait: wait for the command to finish :return: True

get(task_name=None, task_id=None) dtlpy.entities.task.Task[source]

Get an Annotation Task object :param task_name: optional - search by name :param task_id: optional - search by id :return: Task object

get_items(task_id: Optional[str] = None, task_name: Optional[str] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, filters: Optional[dtlpy.entities.filters.Filters] = None) dtlpy.entities.paged_entities.PagedEntities[source]
Parameters
  • task_id

  • task_name

  • dataset

  • filters

Returns

list(project_ids=None, status=None, task_name=None, pages_size=None, page_offset=None, recipe=None, creator=None, assignments=None, min_date=None, max_date=None, filters: Optional[dtlpy.entities.filters.Filters] = None) Union[dtlpy.miscellaneous.list_print.List[dtlpy.entities.task.Task], dtlpy.entities.paged_entities.PagedEntities][source]

Get the Annotation Task list :param project_ids: list of project ids :param status: :param task_name: task name :param pages_size: :param page_offset: :param recipe: :param creator: :param assignments: assignments :param min_date: double :param max_date: double :param filters: dl.Filters entity to filter items :return: List of Annotation Task objects

open_in_web(task_name=None, task_id=None, task=None)[source]
Parameters
  • task_name

  • task_id

  • task

query(filters=None, project_ids=None)[source]
Parameters
  • filters

  • project_ids

set_status(status: str, operation: str, task_id: str, item_ids: List[str])[source]

Update item status within task

Parameters
  • status – str - string that describes the status

  • operation – str - ‘create’ or ‘delete’

  • task_id – str - task id

  • item_ids – List[str]

:return: Boolean

update(task: Optional[dtlpy.entities.task.Task] = None, system_metadata=False) dtlpy.entities.task.Task[source]

Update an Annotation Task :param task: task entity :param system_metadata: True, if you want to change the system metadata :return: Annotation Task object

Assignments

class Assignments(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None, task: Optional[dtlpy.entities.task.Task] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, project_id=None)[source]

Bases: object

Assignments repository

create(assignee_id, task=None, filters=None, items=None) dtlpy.entities.assignment.Assignment[source]

Create a new assignment :param assignee_id: the assignee for the assignment :param task: task entity :param filters: Filters entity or a dictionary containing filters parameters :param items: list of items :return: Assignment object
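As a sketch, create() pairs an assignee with a task and an optional item filter. Treating assignee_id as the annotator's email is an assumption based on the platform's member APIs, not something the docstring above states.

```python
def create_assignment(assignments_repo, task, assignee_email, filters=None):
    """Create an assignment for all task items matching `filters`.

    `filters` may be a dl.Filters entity or a plain filters dict, per the
    docstring above; None assigns without an item filter.
    """
    return assignments_repo.create(
        assignee_id=assignee_email,
        task=task,
        filters=filters,
    )
```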

get(assignment_name=None, assignment_id=None)[source]

Get an Assignment object :param assignment_name: optional - search by name :param assignment_id: optional - search by id :return: Assignment object

get_items(assignment: Optional[dtlpy.entities.assignment.Assignment] = None, assignment_id=None, assignment_name=None, dataset=None, filters=None) dtlpy.entities.paged_entities.PagedEntities[source]

Get all the items in the assignment :param assignment: assignment entity :param assignment_id: assignment id :param assignment_name: assignment name :param dataset: dataset entity :param filters: Filters entity or a dictionary containing filters parameters :return:

list(project_ids=None, status=None, assignment_name=None, assignee_id=None, pages_size=None, page_offset=None, task_id=None) dtlpy.miscellaneous.list_print.List[dtlpy.entities.assignment.Assignment][source]

Get Assignments list

Parameters
  • project_ids – list of project ids

  • status

  • assignment_name

  • assignee_id

  • pages_size

  • page_offset

  • task_id

Returns

List of Assignment objects

open_in_web(assignment_name=None, assignment_id=None, assignment=None)[source]
Parameters
  • assignment_name

  • assignment_id

  • assignment

reassign(assignee_id, assignment=None, assignment_id=None, task=None, task_id=None, wait=True)[source]

Reassign an assignment :param assignee_id: :param assignment: :param assignment_id: :param task: :param task_id: :param wait: wait for the command to finish :return: Assignment object

redistribute(workload, assignment=None, assignment_id=None, task=None, task_id=None, wait=True)[source]

Redistribute an assignment :param workload: :param assignment: :param assignment_id: :param task: :param task_id: :param wait: wait for the command to finish :return: Assignment object
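redistribute() takes a workload describing what share of the assignment each annotator receives; the exact entity type for the workload parameter is not documented above, so this sketch only computes an even percentage split and leaves wrapping it (e.g. in a dl.Workload) to the caller.

```python
def even_loads(assignee_ids):
    """Split 100% of an assignment evenly across assignees.

    Returns (assignee_ids, loads), where loads are percentages summing
    to 100 - one entry per assignee.
    """
    share = 100.0 / len(assignee_ids)
    return assignee_ids, [share] * len(assignee_ids)
```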

set_status(status: str, operation: str, item_id: str, assignment_id: str)[source]

Set item status within assignment :param status: str :param operation: 'created' / 'deleted' :param item_id: str :param assignment_id: str :return: Boolean

update(assignment: Optional[dtlpy.entities.assignment.Assignment] = None, system_metadata=False) dtlpy.entities.assignment.Assignment[source]

Update an assignment :param assignment: assignment entity :param system_metadata: bool - True to also update system metadata :return: Assignment object

Packages

class LocalServiceRunner(client_api: dtlpy.services.api_client.ApiClient, packages, cwd=None, multithreading=False, concurrency=10, package: Optional[dtlpy.entities.package.Package] = None, module_name='default_module', function_name='run', class_name='ServiceRunner', entry_point='main.py', mock_file_path=None)[source]

Bases: object

Service Runner Class

get_field(field_name, field_type, mock_json, project=None, mock_inputs=None)[source]

Get field in mock json :param field_name: :param field_type: :param mock_json: :param project: :param mock_inputs: :return:

get_mainpy_run_service()[source]

Get mainpy run service :return:

run_local_project(project=None)[source]
Parameters

project

class Packages(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Packages Repository

build_requirements(filepath) dtlpy.repositories.packages.Packages.list[source]

build a requirements list from file path :param filepath: path of the requirements file :return: a list of dl.PackageRequirement

static build_trigger_dict(actions, name='default_module', filters=None, function='run', execution_mode='Once', type_t='Event')[source]

build trigger dict :param actions: :param name: :param filters: :param function: :param execution_mode: :param type_t:

static check_cls_arguments(cls, missing, function_name, function_inputs)[source]
Parameters
  • cls

  • missing

  • function_name

  • function_inputs

checkout(package=None, package_id=None, package_name=None)[source]

Checkout a package :param package: :param package_id: :param package_name: :return:

delete(package: Optional[dtlpy.entities.package.Package] = None, package_name=None, package_id=None)[source]

Delete Package object

Parameters
  • package

  • package_name

  • package_id

Returns

True

deploy(package_id=None, package_name=None, package=None, service_name=None, project_id=None, revision=None, init_input=None, runtime=None, sdk_version=None, agent_versions=None, bot=None, pod_type=None, verify=True, checkout=False, module_name=None, run_execution_as_process=None, execution_timeout=None, drain_time=None, on_reset=None, max_attempts=None, force=False, **kwargs) dtlpy.entities.service.Service[source]

Deploy package :param package_id: :param package_name: :param package: :param service_name: :param project_id: :param revision: :param init_input: :param runtime: :param sdk_version: optional - string - sdk version :param agent_versions: optional - dictionary - versions of sdk, agent runner and agent proxy :param bot: :param pod_type: :param verify: :param checkout: :param module_name: :param run_execution_as_process: :param execution_timeout: :param drain_time: :param on_reset: :param max_attempts: maximum execution retries in case of a service reset :param force: optional - terminate old replicas immediately :return:

deploy_from_file(project, json_filepath)[source]
Parameters
  • project

  • json_filepath

static generate(name=None, src_path=None, service_name=None, package_type='default_package_type')[source]

Generate new package environment :param name: :param src_path: :param service_name: :param package_type: :return:

get(package_name=None, package_id=None, checkout=False, fetch=None) dtlpy.entities.package.Package[source]

Get Package object :param package_name: :param package_id: :param checkout: bool :param fetch: optional - fetch entity from platform, default taken from cookie :return: Package object

list(filters: Optional[dtlpy.entities.filters.Filters] = None, project_id=None) dtlpy.entities.paged_entities.PagedEntities[source]

List project packages :param filters: :param project_id: :return:

open_in_web(package=None, package_id=None, package_name=None)[source]
Parameters
  • package

  • package_id

  • package_name

pull(package: dtlpy.entities.package.Package, version=None, local_path=None, project_id=None)[source]
Parameters
  • package

  • version

  • local_path

  • project_id

Returns

push(project: Optional[dtlpy.entities.project.Project] = None, project_id: Optional[str] = None, package_name: Optional[str] = None, src_path: Optional[str] = None, codebase: Optional[Union[dtlpy.entities.codebase.GitCodebase, dtlpy.entities.codebase.ItemCodebase, dtlpy.entities.codebase.FilesystemCodebase]] = None, modules: Optional[List[dtlpy.entities.package_module.PackageModule]] = None, is_global: Optional[bool] = None, checkout: bool = False, revision_increment: Optional[str] = None, version: Optional[str] = None, ignore_sanity_check: bool = False, service_update: bool = False, service_config: Optional[dict] = None, slots: Optional[List[dtlpy.entities.package_slot.PackageSlot]] = None, requirements: Optional[List[dtlpy.entities.package.PackageRequirement]] = None) dtlpy.entities.package.Package[source]

Push local package. Project will be taken in the following hierarchy: project(input) -> project_id(input) -> self.project(context) -> checked out

Parameters
  • project – optional - project entity to deploy to. default from context or checked-out

  • project_id – optional - project id to deploy to. default from context or checked-out

  • package_name – package name

  • src_path – path to package codebase

  • codebase

  • modules – list of modules PackageModules of the package

  • is_global

  • checkout – checkout package to local dir

  • revision_increment – optional - str - version bumping method - major/minor/patch - default = None

  • version – semver version of the package

  • ignore_sanity_check – NOT RECOMMENDED - skip code sanity check before pushing

  • service_update – optional - bool - update the service

  • service_config – optional - json of service configuration to apply to services created from this package

  • slots – optional - list of slots PackageSlot of the package

  • requirements – requirements - list of package requirements

Returns
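Putting push() and deploy() together, a minimal sketch. The chained package.deploy() call, the "patch" revision bump, and the default main.py/ServiceRunner entry point are assumptions based on the signatures above, not a prescribed workflow.

```python
def push_and_deploy(project, src_path, package_name):
    """Push a local codebase as a package, then deploy it as a service.

    `project` is assumed to be a dl.Project; `src_path` should contain
    main.py with a ServiceRunner class (the defaults named above).
    """
    package = project.packages.push(
        package_name=package_name,
        src_path=src_path,
        revision_increment="patch",  # bump the patch version on each push
    )
    # Service name defaults to the package name in this sketch.
    return package.deploy(service_name=package_name)
```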

revisions(package: Optional[dtlpy.entities.package.Package] = None, package_id=None)[source]

Get package revisions history

Parameters
  • package – Package entity

  • package_id – package id

test_local_package(cwd=None, concurrency=None, package: Optional[dtlpy.entities.package.Package] = None, module_name='default_module', function_name='run', class_name='ServiceRunner', entry_point='main.py', mock_file_path=None)[source]

Test local package :param cwd: str - path to the file :param concurrency: int - the concurrency of the test :param package: entities.package :param module_name: str - module name :param function_name: str - function name :param class_name: str - class name :param entry_point: str - the file to run, e.g. main.py :param mock_file_path: str - the mock file that holds the inputs :return:

update(package: dtlpy.entities.package.Package, revision_increment: Optional[str] = None) dtlpy.entities.package.Package[source]

Update Package changes to platform :param package: :param revision_increment: optional - str - version bumping method - major/minor/patch - default = None :return: Package entity

Codebases

class Codebases(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None, dataset: Optional[dtlpy.entities.dataset.Dataset] = None, project_id: Optional[str] = None)[source]

Bases: object

Codebase repository

clone_git(codebase, local_path)[source]
Parameters
  • codebase

  • local_path

get(codebase_name=None, codebase_id=None, version=None)[source]

Get a Codebase object :param codebase_name: optional - search by name :param codebase_id: optional - search by id :param version: codebase version. default is latest. options: “all”, “latest” or ver number - “10” :return: Codebase object

static get_current_version(all_versions_pages, zip_md)[source]
Parameters
  • all_versions_pages

  • zip_md

list() dtlpy.entities.paged_entities.PagedEntities[source]

List all code bases :return: Paged entity

list_versions(codebase_name)[source]

List all codebase versions

Parameters

codebase_name – code base name

Returns

list of versions

pack(directory, name=None, description='')[source]

Zip a local code directory and post to codebases :param directory: local directory to pack :param name: codebase name :param description: codebase description :return: Codebase object

pull_git(codebase, local_path)[source]
Parameters
  • codebase

  • local_path

unpack(codebase: Optional[dtlpy.entities.codebase.Codebase] = None, codebase_name=None, codebase_id=None, local_path=None, version=None)[source]

Unpack codebase locally. Download source code and unzip :param codebase: dl.Codebase object :param codebase_name: search by name :param codebase_id: search by id :param local_path: local path to save codebase :param version: codebase version to unpack. default - latest :return: String (dirpath)
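pack() and unpack() round-trip a code directory through the platform. A sketch; the codebases_repo parameter standing in for a Codebases repository instance (e.g. project.codebases) is an assumption of this example.

```python
def snapshot_code(codebases_repo, directory, name, local_path):
    """Zip `directory`, upload it as a codebase, then restore it locally.

    unpack() downloads the latest version by default and returns the
    directory path of the extracted code, per the docstring above.
    """
    codebase = codebases_repo.pack(directory=directory, name=name)
    return codebases_repo.unpack(codebase=codebase, local_path=local_path)
```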

Services

class ServiceLog(_json: dict, service: dtlpy.entities.service.Service, services: dtlpy.repositories.services.Services, start=None, follow=None, execution_id=None, function_name=None, replica_id=None, system=False)[source]

Bases: object

Service Log

view(until_completed)[source]
Parameters

until_completed

Bots

class Bots(client_api: dtlpy.services.api_client.ApiClient, project: dtlpy.entities.project.Project)[source]

Bases: object

Bots repository

create(name, return_credentials: bool = False)[source]

Create a new Bot :param name: :param return_credentials: if True, the bot's password is returned on creation :return: Bot object
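A get-or-create sketch for bots; the broad except stands in for the SDK's not-found error, whose exact class is not shown in this section.

```python
def get_or_create_bot(bots_repo, name):
    """Return the bot named `name`, creating it if it does not exist.

    return_credentials=True makes create() return the password as well,
    per the docstring above; store it securely.
    """
    try:
        return bots_repo.get(bot_name=name)
    except Exception:  # stands in for the SDK's not-found error
        return bots_repo.create(name=name, return_credentials=True)
```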

delete(bot_id=None, bot_email=None)[source]

Delete a Bot :param bot_id: bot id to delete :param bot_email: bot email to delete :return: True

get(bot_email=None, bot_id=None, bot_name=None)[source]

Get a Bot object :param bot_email: get bot by email :param bot_id: get bot by id :param bot_name: get bot by name :return: Bot object

list() dtlpy.miscellaneous.list_print.List[dtlpy.entities.bot.Bot][source]

Get project’s bots list. :return: List of Bots objects

Triggers

class Triggers(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None, service: Optional[dtlpy.entities.service.Service] = None, project_id: Optional[str] = None, pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None)[source]

Bases: object

Triggers repository

create(service_id: Optional[str] = None, trigger_type: dtlpy.entities.trigger.TriggerType = TriggerType.EVENT, name: Optional[str] = None, webhook_id=None, function_name='run', project_id=None, active=True, filters=None, resource: dtlpy.entities.trigger.TriggerResource = TriggerResource.ITEM, actions: Optional[dtlpy.entities.trigger.TriggerAction] = None, execution_mode: dtlpy.entities.trigger.TriggerExecutionMode = TriggerExecutionMode.ONCE, start_at=None, end_at=None, inputs=None, cron=None, pipeline_id=None, pipeline=None, pipeline_node_id=None, root_node_namespace=None, **kwargs) dtlpy.entities.trigger.BaseTrigger[source]

Create a Trigger. Can create two types: a cron trigger or an event trigger. Inputs are different for each type

Inputs for all types:

Parameters
  • service_id – Id of services to be triggered

  • trigger_type – can be cron or event. use enum dl.TriggerType for the full list

  • name – name of the trigger

  • webhook_id – id for webhook to be called

  • function_name – the function name to be called when triggered. must be defined in the package

  • project_id – project id where trigger will work

  • active – optional - True/False, default = True

Inputs for event trigger: :param filters: optional - Item/Annotation metadata filters, default = none :param resource: optional - Dataset/Item/Annotation/ItemStatus, default = Item :param actions: optional - Created/Updated/Deleted, default = create :param execution_mode: how many times the trigger should be activated; default is “Once”. enum dl.TriggerExecutionMode

Inputs for cron trigger: :param start_at: iso format date string to start activating the cron trigger :param end_at: iso format date string to end the cron activation :param inputs: dictionary “name”:”val” of inputs to the function :param cron: cron spec specifying when it should run. more information: https://en.wikipedia.org/wiki/Cron :param pipeline_id: Id of pipeline to be triggered :param pipeline: pipeline entity to be triggered :param pipeline_node_id: Id of pipeline root node to be triggered :param root_node_namespace: namespace of pipeline root node to be triggered

Returns

Trigger entity
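A sketch of the cron flavor. In real code trigger_type should be dl.TriggerType.CRON; the plain string used here to keep the example dependency-free is an assumption about the enum's value, and triggers_repo stands in for a Triggers repository instance.

```python
def daily_cron_trigger(triggers_repo, service_id, inputs,
                       trigger_type="Cron"):
    """Create a cron trigger that runs a service once a day at 08:00.

    `inputs` maps function argument names to values ({"name": "val"}),
    per the parameter list above.
    """
    return triggers_repo.create(
        service_id=service_id,
        trigger_type=trigger_type,  # use dl.TriggerType.CRON in real code
        name="daily-run",
        cron="0 8 * * *",  # minute hour day-of-month month day-of-week
        inputs=inputs,
    )
```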

delete(trigger_id=None, trigger_name=None)[source]

Delete Trigger object

Parameters
  • trigger_id

  • trigger_name

Returns

True

get(trigger_id=None, trigger_name=None) dtlpy.entities.trigger.BaseTrigger[source]

Get Trigger object :param trigger_id: :param trigger_name: :return: Trigger object

list(filters: Optional[dtlpy.entities.filters.Filters] = None) dtlpy.entities.paged_entities.PagedEntities[source]

List project triggers :param filters: :return:

name_validation(name: str)[source]
Parameters

name

resource_information(resource, resource_type, action='Created')[source]

Return which function should run on an item (based on global triggers)

Parameters
  • resource – ‘Item’ / ‘Dataset’ / etc

  • resource_type – dictionary of the resource object

  • action – ‘Created’ / ‘Updated’ / etc.

update(trigger: dtlpy.entities.trigger.BaseTrigger) dtlpy.entities.trigger.BaseTrigger[source]
Parameters

trigger – Trigger entity

Returns

Trigger entity

Executions

class Executions(client_api: dtlpy.services.api_client.ApiClient, service: Optional[dtlpy.entities.service.Service] = None, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Service Executions repository

create(service_id=None, execution_input=None, function_name=None, resource: Optional[dtlpy.entities.package_function.PackageInputType] = None, item_id=None, dataset_id=None, annotation_id=None, project_id=None, sync=False, stream_logs=False, return_output=False, return_curl_only=False, timeout=None) dtlpy.entities.execution.Execution[source]

Execute a function on an existing service

Parameters
  • service_id – service id to execute on

  • execution_input – input dictionary or list of FunctionIO entities

  • function_name – function name to run

  • resource – input type.

  • item_id – optional - input to function

  • dataset_id – optional - input to function

  • annotation_id – optional - input to function

  • project_id – resource’s project

  • sync – wait for function to end

  • stream_logs – prints logs of the new execution. only works with sync=True

  • return_output – if True and sync is True - will return the output directly

  • return_curl_only – return the cURL of the creation request without actually executing it

  • timeout – int, seconds to wait until TimeoutError is raised. if <=0, wait until done; by default, waits for the service timeout

Returns
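A sketch of a blocking, single-item execution; executions_repo standing in for an Executions repository instance (e.g. service.executions) is an assumption of this example.

```python
def run_on_item(executions_repo, service_id, item_id):
    """Execute a service's function on one item and return its output.

    sync=True blocks until the function ends; combined with
    return_output=True, the function's output is returned directly,
    per the parameter list above.
    """
    return executions_repo.create(
        service_id=service_id,
        item_id=item_id,
        sync=True,
        return_output=True,
    )
```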

get(execution_id=None, sync=False) dtlpy.entities.execution.Execution[source]

Get Service execution object

Parameters
  • execution_id

  • sync – wait for the execution to finish

Returns

Service execution object

increment(execution: dtlpy.entities.execution.Execution)[source]

Increment attempts :param execution: :return: int

list(filters: Optional[dtlpy.entities.filters.Filters] = None) dtlpy.entities.paged_entities.PagedEntities[source]

List service executions :param filters: dl.Filters entity to filter items :return:

logs(execution_id, follow=True, until_completed=True)[source]

Execution logs :param execution_id: :param follow: :param until_completed: :return: execution logs

progress_update(execution_id: str, status: Optional[dtlpy.entities.execution.ExecutionStatus] = None, percent_complete: Optional[int] = None, message: Optional[str] = None, output: Optional[str] = None, service_version: Optional[str] = None)[source]

Update Execution Progress

Parameters
  • execution_id

  • status – ExecutionStatus

  • percent_complete

  • message

  • output

  • service_version

Returns

rerun(execution: dtlpy.entities.execution.Execution, sync: bool = False)[source]

Rerun an execution :param execution: :param sync: wait for the execution to finish :return:

terminate(execution: dtlpy.entities.execution.Execution)[source]

Terminate Execution :param execution: :return:

update(execution: dtlpy.entities.execution.Execution) dtlpy.entities.execution.Execution[source]

Update execution changes to platform :param execution: execution entity :return: execution entity

wait(execution_id, timeout=None)[source]

Get Service execution object

Parameters
  • execution_id

  • timeout – int, seconds to wait until TimeoutError is raised. if <=0, wait until done; by default, waits for the service timeout

Returns

Service execution object

Pipelines

class Pipelines(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None)[source]

Bases: object

Pipelines Repository

create(name=None, project_id=None, pipeline_json=None) dtlpy.entities.pipeline.Pipeline[source]

Create a new pipeline :param name: str - pipeline name :param project_id: str - project id :param pipeline_json: dict - json contain the pipeline fields :return: Pipeline object

delete(pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None, pipeline_name=None, pipeline_id=None)[source]

Delete Pipeline object

Parameters
  • pipeline

  • pipeline_name

  • pipeline_id

Returns

True

execute(pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None, pipeline_id: Optional[str] = None, pipeline_name: Optional[str] = None, execution_input=None)[source]

Execute a pipeline and return the execution :param pipeline: entities.Pipeline object :param pipeline_id: pipeline id :param pipeline_name: pipeline name :param execution_input: list of dl.FunctionIO or dict of pipeline input - example {‘item’: ‘item_id’} :return: entities.PipelineExecution object
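As a sketch, a single-item pipeline run using the dict form of execution_input; pipelines_repo standing in for a Pipelines repository instance (e.g. project.pipelines) is an assumption of this example.

```python
def run_pipeline_on_item(pipelines_repo, pipeline_name, item_id):
    """Execute a pipeline by name with a single item as input.

    execution_input may be a dict mapping the pipeline input name to a
    value, e.g. {'item': item_id}, per the docstring above.
    """
    return pipelines_repo.execute(
        pipeline_name=pipeline_name,
        execution_input={"item": item_id},
    )
```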

get(pipeline_name=None, pipeline_id=None, fetch=None) dtlpy.entities.pipeline.Pipeline[source]

Get Pipeline object

Parameters
  • pipeline_name – str

  • pipeline_id – str

  • fetch – optional - fetch entity from platform, default taken from cookie

Returns

Pipeline object

install(pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None)[source]

install a pipeline :param pipeline: :return: Composition object

list(filters: Optional[dtlpy.entities.filters.Filters] = None, project_id=None) dtlpy.entities.paged_entities.PagedEntities[source]

List project pipelines :param filters: :param project_id: :return:

open_in_web(pipeline=None, pipeline_id=None, pipeline_name=None)[source]
Parameters
  • pipeline

  • pipeline_id

  • pipeline_name

pause(pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None)[source]

pause a pipeline :param pipeline: :return: Composition object

update(pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None) dtlpy.entities.pipeline.Pipeline[source]

Update pipeline changes to platform

Parameters

pipeline

Returns

pipeline entity

Pipeline Executions

class PipelineExecutions(client_api: dtlpy.services.api_client.ApiClient, project: Optional[dtlpy.entities.project.Project] = None, pipeline: Optional[dtlpy.entities.pipeline.Pipeline] = None)[source]

Bases: object

PipelineExecutions Repository

create(pipeline_id: Optional[str] = None, execution_input=None)[source]

Execute a pipeline and return the execution :param pipeline_id: pipeline id :param execution_input: list of dl.FunctionIO or dict of pipeline input - example {‘item’: ‘item_id’} :return: entities.PipelineExecution object

get(pipeline_execution_id: str, pipeline_id: Optional[str] = None) dtlpy.entities.pipeline.Pipeline[source]

Get Pipeline Execution object

Parameters
  • pipeline_execution_id – str

  • pipeline_id – str

Returns

PipelineExecution object

list(filters: Optional[dtlpy.entities.filters.Filters] = None) dtlpy.entities.paged_entities.PagedEntities[source]

List project pipeline executions :param filters: :return:

General Commands

class Commands(client_api: dtlpy.services.api_client.ApiClient)[source]

Bases: object

Service Commands repository

abort(command_id)[source]

Abort Command

:param command_id: :return:

get(command_id=None, url=None) dtlpy.entities.command.Command[source]

Get Service command object

Parameters
  • command_id

  • url – command url

Returns

Command object

list()[source]

List of commands :return:

wait(command_id, timeout=0, step=5, url=None)[source]

Wait for command to finish

Parameters
  • command_id – Command id to wait for

  • timeout – int, seconds to wait until TimeoutError is raised. if 0 - wait until done

  • step – int, seconds between polling

  • url – url to the command

Returns

Command object

Download Commands

Upload Commands