Entities¶
Organization¶
- class Organization(members: list, groups: list, account: dict, created_at, updated_at, id, name, logo_url, plan, owner, created_by, client_api: ApiClient, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Organization entity
- add_member(email, role: MemberOrgRole = MemberOrgRole.MEMBER)[source]¶
Add members to your organization. Read about members and groups [here](https://dataloop.ai/docs/org-members-groups).
Prerequisites: To add members to an organization, you must be in the role of an “owner” in that organization.
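Example (a minimal sketch; the organization name and member email are placeholders):
import dtlpy as dl
organization = dl.organizations.get(organization_name='my-org')
organization.add_member(email='new.member@example.com', role=dl.MemberOrgRole.MEMBER)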
- cache_action(mode=CacheAction.APPLY, pod_type=PodType.SMALL)[source]¶
Add or remove cache for the organization.
- delete_member(user_id: str, sure: bool = False, really: bool = False)[source]¶
Delete member from the Organization.
Prerequisites: Must be an organization “owner” to delete members.
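Example (a minimal sketch; the user id is a placeholder):
organization.delete_member(user_id='user_id', sure=True, really=True)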
- classmethod from_json(_json, client_api, is_fetched=True)[source]¶
Build an Organization entity object from a json
- Parameters:
- Returns:
Organization object
- Return type:
- list_groups()[source]¶
List all organization groups (groups that were created within the organization).
Prerequisites: You must be an organization “owner” to use this method.
- Returns:
groups list
- Return type:
- list_members(role: Optional[MemberOrgRole] = None)[source]¶
List all organization members.
Prerequisites: You must be an organization “owner” to use this method.
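Example (a minimal sketch; filtering by role is optional):
members = organization.list_members(role=dl.MemberOrgRole.OWNER)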
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(plan: str)[source]¶
Update Organization.
Prerequisites: You must be an Organization superuser to update an organization.
- Parameters:
plan (str) – OrganizationsPlans.FREEMIUM, OrganizationsPlans.PREMIUM
- Returns:
organization object
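Example (a minimal sketch; the plan value is shown for illustration only):
organization = organization.update(plan=dl.OrganizationsPlans.FREEMIUM)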
- update_member(email: str, role: MemberOrgRole = MemberOrgRole.MEMBER)[source]¶
Update member role.
Prerequisites: You must be an organization “owner” to update a member’s role.
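Example (a minimal sketch; the email is a placeholder):
organization.update_member(email='member@example.com', role=dl.MemberOrgRole.ADMIN)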
Integration¶
- class Integration(id, name, type, org, created_at, creator, update_at, url, client_api: ApiClient, metadata=None, project=None)[source]¶
Bases:
BaseEntity
Integration object
- delete(sure: bool = False, really: bool = False) → bool[source]¶
Delete integrations from the Organization
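Example (a minimal sketch; the integration id is a placeholder and the integration is fetched through its project):
integration = project.integrations.get(integrations_id='integrations_id')
is_deleted = integration.delete(sure=True, really=True)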
- classmethod from_json(_json: dict, client_api: ApiClient, is_fetched=True)[source]¶
Build an Integration entity object from a json
- Parameters:
_json – _json response from host
client_api – ApiClient entity
is_fetched – is Entity fetched from Platform
- Returns:
Integration object
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(new_name: Optional[str] = None, new_options: Optional[dict] = None)[source]¶
Update the integration’s name and/or options.
Prerequisites: You must be an owner in the organization.
- Parameters:
- Returns:
Integration object
- Return type:
Examples for options include: s3 - {key: “”, secret: “”}; gcs - {key: “”, secret: “”, content: “”}; azureblob - {key: “”, secret: “”, clientId: “”, tenantId: “”}; key_value - {key: “”, value: “”}; aws-sts - {key: “”, secret: “”, roleArns: “”}; aws-cross - {roleArns: “”}
Example:
project.integrations.update(integrations_id='integrations_id', new_name="new_integration_name")
- class IntegrationType(value)[source]¶
The type of the Integration.
- S3: S3 Integration - for S3 drivers
- AWS_CROSS_ACCOUNT: AWS CROSS ACCOUNT Integration - for S3 drivers
- AWS_STS: AWS STS Integration - for S3 drivers
- GCS: GCS Integration - for GCS drivers
- GCP_CROSS_PROJECT: GCP CROSS PROJECT Integration - for GCP drivers
- AZUREBLOB: AZURE BLOB Integration - for AZUREBLOB and AZURE_DATALAKE_GEN2 drivers
- KEY_VALUE: KEY VALUE Integration - for saving secrets in the platform
- GCP_WORKLOAD_IDENTITY_FEDERATION: GCP Workload Identity Federation Integration - for GCP drivers
Project¶
- class Project(contributors, created_at, creator, id, url, name, org, updated_at, role, account, is_blocked, archived, feature_constraints, client_api: ApiClient, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Project entity
- add_member(email, role: MemberRole = MemberRole.DEVELOPER)[source]¶
Add a member to the project.
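Example (a minimal sketch; the email is a placeholder):
project.add_member(email='annotator@example.com', role=dl.MemberRole.ANNOTATOR)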
- classmethod from_json(_json, client_api, is_fetched=True)[source]¶
Build a Project object from a json
- Parameters:
- Returns:
Project object
- Return type:
- list_members(role: Optional[MemberRole] = None)[source]¶
List the project members.
- Parameters:
role – The required role for the user. Use the enum dl.MemberRole
- Returns:
list of the project members
- Return type:
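Example (a minimal sketch; filtering by role is optional):
members = project.list_members(role=dl.MemberRole.DEVELOPER)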
- to_json()[source]¶
Returns platform _json format of project object
- Returns:
platform json format of project object
- Return type:
- update(system_metadata=False)[source]¶
Update the project
- Parameters:
system_metadata (bool) – optional - True, if you want to change metadata system
- Returns:
Project object
- Return type:
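Example (a minimal sketch; call after changing the entity’s fields locally):
project = project.update()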
- update_member(email, role: MemberRole = MemberRole.DEVELOPER)[source]¶
Update a member’s information/details in the project.
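Example (a minimal sketch; the email is a placeholder):
project.update_member(email='annotator@example.com', role=dl.MemberRole.ANNOTATION_MANAGER)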
User¶
- class User(created_at, updated_at, name, last_name, username, avatar, email, role, type, org, id, project, client_api=None, users=None)[source]¶
Bases:
BaseEntity
User entity
- classmethod from_json(_json, project, client_api, users=None)[source]¶
Build a User entity object from a json
- Parameters:
_json (dict) – _json response from host
project (dtlpy.entities.project.Project) – project entity
client_api – ApiClient entity
users – Users repository
- Returns:
User object
- Return type:
Dataset¶
- class Dataset(id, url, name, annotated, creator, projects, items_count, metadata, directoryTree, expiration_options, index_driver, enable_sync_with_cloned, created_at, updated_at, updated_by, items_url, readable_type, access_level, driver, src_dataset, readonly, annotations_count, client_api: ApiClient, project=None, datasets=None, repositories=_Nothing.NOTHING, ontology_ids=None, labels=None, directory_tree=None, recipe=None, ontology=None)[source]¶
Bases:
BaseEntity
Dataset object
- add_label(label_name, color=None, children=None, attributes=None, display_label=None, label=None, recipe_id=None, ontology_id=None, icon_path=None)[source]¶
Add single label to dataset
Prerequisites: You must have a dataset with items that are related to the annotations. The relationship between the dataset and annotations is shown in the name. You must be in the role of an owner or developer.
- Parameters:
label_name (str) – str - label name
color (tuple) – RGB color of the annotation, e.g (255,0,0) or ‘#ff0000’ for red
children – children (sub labels). list of sub labels of this current label, each value is either dict or dl.Label
attributes (list) – add attributes to the labels
display_label (str) – display name of the label
label (dtlpy.entities.label.Label) – label object
recipe_id (str) – optional recipe id
ontology_id (str) – optional ontology id
icon_path (str) – path to an image to be displayed on the label
- Returns:
label entity
- Return type:
dtlpy.entities.label.Label
Example:
dataset.add_label(label_name='person', color=(34, 6, 231), attributes=['big', 'small'])
- add_labels(label_list, ontology_id=None, recipe_id=None)[source]¶
Add labels to dataset
Prerequisites: You must have a dataset with items that are related to the annotations. The relationship between the dataset and annotations is shown in the name. You must be in the role of an owner or developer.
- Parameters:
- Returns:
label entities
Example:
dataset.add_labels(label_list=label_list)
- clone(clone_name=None, filters=None, with_items_annotations=True, with_metadata=True, with_task_annotations_status=True, dst_dataset_id=None, target_directory=None)[source]¶
Clone dataset
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
clone_name (str) – new dataset name
filters (dtlpy.entities.filters.Filters) – Filters entity or a query dict
with_items_annotations (bool) – clone all item’s annotations
with_metadata (bool) – clone metadata
with_task_annotations_status (bool) – clone task annotations status
dst_dataset_id (str) – destination dataset id
target_directory (str) – target directory
- Returns:
dataset object
- Return type:
Example:
dataset = dataset.clone(clone_name='dataset_clone_name', with_metadata=True, with_items_annotations=False, with_task_annotations_status=False)
- delete(sure=False, really=False)[source]¶
Delete a dataset forever!
Prerequisites: You must be an owner or developer to use this method.
- Parameters:
- Returns:
True if success
- Return type:
Example:
is_deleted = dataset.delete(sure=True, really=True)
- delete_attributes(keys: list, recipe_id: Optional[str] = None, ontology_id: Optional[str] = None)[source]¶
Delete a bulk of attributes
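Example (a minimal sketch; the attribute keys and recipe id are placeholders):
dataset.delete_attributes(keys=['1'], recipe_id='recipe_id')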
- delete_labels(label_names)[source]¶
Delete labels from dataset’s ontologies
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
label_names – label object/ label name / list of label objects / list of label names
Example:
dataset.delete_labels(label_names=['myLabel1', 'Mylabel2'])
- download(filters=None, local_path=None, file_types=None, annotation_options: Optional[ViewAnnotationOptions] = None, annotation_filters=None, overwrite=False, to_items_folder=True, thickness=1, with_text=False, without_relative_path=None, alpha=1, export_version=ExportVersion.V1)[source]¶
Download dataset items by filters. Filter the dataset for items and save them locally. Optionally also download the annotations, mask, instance, and image mask of each item.
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
local_path (str) – local folder or filename to save to.
file_types (list) – a list of file type to download. e.g [‘video/webm’, ‘video/mp4’, ‘image/jpeg’, ‘image/png’]
annotation_options (list) – type of download annotations: list(dl.ViewAnnotationOptions)
annotation_filters (dtlpy.entities.filters.Filters) – Filters entity to filter annotations for download
overwrite (bool) – optional - default = False to overwrite the existing files
to_items_folder (bool) – Create ‘items’ folder and download items to it
thickness (int) – optional - line thickness, if -1 annotation will be filled, default =1
with_text (bool) – optional - add text to annotations, default = False
without_relative_path (bool) – bool - download items without the relative path from platform
alpha (float) – opacity value [0 1], default 1
export_version (str) – V2 - exported items will have original extension in filename, V1 - no original extension in filenames
- Returns:
List of local_path per each downloaded item
Example:
dataset.download(local_path='local_path', annotation_options=[dl.ViewAnnotationOptions.JSON, dl.ViewAnnotationOptions.MASK], overwrite=False, thickness=1, with_text=False, alpha=1 )
- download_annotations(local_path=None, filters=None, annotation_options: Optional[ViewAnnotationOptions] = None, annotation_filters=None, overwrite=False, thickness=1, with_text=False, remote_path=None, include_annotations_in_output=True, export_png_files=False, filter_output_annotations=False, alpha=1, export_version=ExportVersion.V1)[source]¶
Download dataset items by filters. Filter the dataset for items and save them locally. Optionally also download the annotations, mask, instance, and image mask of each item.
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
local_path (str) – local folder or filename to save to.
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
annotation_options (list(dtlpy.entities.annotation.ViewAnnotationOptions)) – download annotations options: list(dl.ViewAnnotationOptions)
annotation_filters (dtlpy.entities.filters.Filters) – Filters entity to filter annotations for download
overwrite (bool) – optional - default = False
thickness (int) – optional - line thickness, if -1 annotation will be filled, default =1
with_text (bool) – optional - add text to annotations, default = False
remote_path (str) – DEPRECATED and ignored
include_annotations_in_output (bool) – default - False , if export should contain annotations
export_png_files (bool) – default - if True, semantic annotations should be exported as png files
filter_output_annotations (bool) – default - False, given an export by filter - determine if to filter out annotations
alpha (float) – opacity value [0 1], default 1
export_version (str) – V2 - exported items will have original extension in filename, V1 - no original extension in filenames
- Returns:
local_path of the directory where all the items were downloaded
- Return type:
Example:
local_path = dataset.download_annotations(local_path='local_path', annotation_options=[dl.ViewAnnotationOptions.JSON, dl.ViewAnnotationOptions.MASK], overwrite=False, thickness=1, with_text=False, alpha=1)
- download_folder(folder_path, filters=None, local_path=None, file_types=None, annotation_options: Optional[ViewAnnotationOptions] = None, annotation_filters=None, overwrite=False, to_items_folder=True, thickness=1, with_text=False, without_relative_path=None, alpha=1, export_version=ExportVersion.V1)[source]¶
Download a dataset folder. Optionally also download the annotations, mask, instance, and image mask of each item.
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
folder_path (str) – the path of the folder to download
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
local_path (str) – local folder or filename to save to.
file_types (list) – a list of file type to download. e.g [‘video/webm’, ‘video/mp4’, ‘image/jpeg’, ‘image/png’]
annotation_options (list) – type of download annotations: list(dl.ViewAnnotationOptions)
annotation_filters (dtlpy.entities.filters.Filters) – Filters entity to filter annotations for download
overwrite (bool) – optional - default = False to overwrite the existing files
to_items_folder (bool) – Create ‘items’ folder and download items to it
thickness (int) – optional - line thickness, if -1 annotation will be filled, default =1
with_text (bool) – optional - add text to annotations, default = False
without_relative_path (bool) – bool - download items without the relative path from platform
alpha (float) – opacity value [0 1], default 1
export_version (str) – V2 - exported items will have original extension in filename, V1 - no original extension in filenames
- Returns:
List of local_path per each downloaded item
Example:
dataset.download_folder(folder_path='folder_path', local_path='local_path', annotation_options=[dl.ViewAnnotationOptions.JSON, dl.ViewAnnotationOptions.MASK], overwrite=False, thickness=1, with_text=False, alpha=1)
- export(local_path=None, filters=None, annotation_filters=None, feature_vector_filters=None, include_feature_vectors: bool = False, include_annotations: bool = False, export_type: ExportType = ExportType.JSON, timeout: int = 0)[source]¶
Export dataset items and annotations.
Prerequisites: You must be an owner or developer to use this method.
You must provide at least ONE of the following params: dataset, dataset_name, dataset_id.
- Parameters:
local_path (str) – The local path to save the exported dataset
filters (Union[dict, dtlpy.entities.filters.Filters]) – Filters entity or a query dictionary
annotation_filters (dtlpy.entities.filters.Filters) – Filters entity
feature_vector_filters (dtlpy.entities.filters.Filters) – Filters entity
include_feature_vectors (bool) – Include item feature vectors in the export
include_annotations (bool) – Include item annotations in the export
export_type (entities.ExportType) – Type of export (‘json’ or ‘zip’)
timeout (int) – Maximum time in seconds to wait for the export to complete
- Returns:
Exported item
- Return type:
Example:
export_item = dataset.export(filters=filters, include_feature_vectors=True, include_annotations=True, export_type=dl.ExportType.JSON)
- classmethod from_json(project: Project, _json: dict, client_api: ApiClient, datasets=None, is_fetched=True)[source]¶
Build a Dataset entity object from a json
- Parameters:
- Returns:
Dataset object
- Return type:
- static serialize_labels(labels_dict)[source]¶
Convert hex color format to rgb
- Parameters:
labels_dict (dict) – dict of labels
- Returns:
dict of converted labels
- set_readonly(state: bool)[source]¶
Set dataset readonly mode
Prerequisites: You must be in the role of an owner or developer.
- Parameters:
state (bool) – state
Example:
dataset.set_readonly(state=True)
- switch_recipe(recipe_id=None, recipe=None)[source]¶
Switch the recipe that is linked to the dataset with the given one
- Parameters:
recipe_id (str) – recipe id
recipe (dtlpy.entities.recipe.Recipe) – recipe entity
Example:
dataset.switch_recipe(recipe_id='recipe_id')
- sync(wait=True)[source]¶
Sync dataset with external storage
Prerequisites: You must be in the role of an owner or developer.
Example:
success = dataset.sync()
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(system_metadata=False)[source]¶
Update dataset field
Prerequisites: You must be an owner or developer to use this method.
- Parameters:
system_metadata (bool) – bool - True, if you want to change metadata system
- Returns:
Dataset object
- Return type:
Example:
dataset = dataset.update()
- update_attributes(title: str, key: str, attribute_type, recipe_id: Optional[str] = None, ontology_id: Optional[str] = None, scope: Optional[list] = None, optional: Optional[bool] = None, values: Optional[list] = None, attribute_range=None)[source]¶
Add a new attribute or update it if it exists
- Parameters:
ontology_id (str) – ontology_id
title (str) – attribute title
key (str) – the key of the attribute, must be unique
attribute_type (AttributesTypes) – dl.AttributesTypes your attribute type
scope (list) – list of the labels or * for all labels
optional (bool) – optional attribute
values (list) – list of the attribute values ( for checkbox and radio button)
attribute_range (dict or AttributesRange) – dl.AttributesRange object
- Returns:
True if success
- Return type:
Example:
dataset.update_attributes(ontology_id='ontology_id', key='1', title='checkbox', attribute_type=dl.AttributesTypes.CHECKBOX, values=[1,2,3])
- update_label(label_name, color=None, children=None, attributes=None, display_label=None, label=None, recipe_id=None, ontology_id=None, upsert=False, icon_path=None)[source]¶
Update a single label in the dataset
Prerequisites: You must have a dataset with items that are related to the annotations. The relationship between the dataset and annotations is shown in the name. You must be in the role of an owner or developer.
- Parameters:
label_name (str) – str - label name
color (tuple) – color
children – children (sub labels)
attributes (list) – add attributes to the labels
display_label (str) – display name of the label
label (dtlpy.entities.label.Label) – label
recipe_id (str) – optional recipe id
ontology_id (str) – optional ontology id
icon_path (str) – path to an image to be displayed on the label
- Returns:
label entity
- Return type:
dtlpy.entities.label.Label
Example:
dataset.update_label(label_name='person', color=(34, 6, 231), attributes=['big', 'small'])
- update_labels(label_list, ontology_id=None, recipe_id=None, upsert=False)[source]¶
Update a list of labels in the dataset
Prerequisites: You must have a dataset with items that are related to the annotations. The relationship between the dataset and annotations is shown in the name. You must be in the role of an owner or developer.
- Parameters:
- Returns:
label entities
- Return type:
dtlpy.entities.label.Label
Example:
dataset.update_labels(label_list=label_list)
- upload_annotations(local_path, filters=None, clean=False, remote_root_path='/', export_version=ExportVersion.V1)[source]¶
Upload annotations to dataset.
Prerequisites: You must have a dataset with items that are related to the annotations. The relationship between the dataset and annotations is shown in the name. You must be in the role of an owner or developer.
- Parameters:
local_path (str) – str - local folder where the annotation files are
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
clean (bool) – bool - if True, remove the old annotations
remote_root_path (str) – str - the remote root path to match remote and local items
export_version (str) – V2 - exported items will have original extension in filename, V1 - no original extension in filenames
For example, if the item filepath is a/b/item and remote_root_path is /a, the start folder will be b instead of a
Example:
dataset.upload_annotations(local_path='local_path', clean=False, export_version=dl.ExportVersion.V1)
- class ExpirationOptions(item_max_days: Optional[int] = None)[source]¶
Bases:
object
ExpirationOptions object
Driver¶
- class AzureBlobDriver(creator, allow_external_delete, allow_external_modification, created_at, type, integration_id, integration_type, metadata, name, id, path, client_api: ApiClient, repositories=_Nothing.NOTHING, container_name=None)[source]¶
Bases:
Driver
- class Driver(creator, allow_external_delete, allow_external_modification, created_at, type, integration_id, integration_type, metadata, name, id, path, client_api: ApiClient, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Driver entity
- delete(sure=False, really=False)[source]¶
Delete a driver forever!
Prerequisites: You must be an owner or developer to use this method.
- Parameters:
- Returns:
True if success
- Return type:
Example:
driver.delete(sure=True, really=True)
- class ExternalStorage(value)[source]¶
The type of the external storage.
- S3: AWS S3 drivers
- GCS: Google GCS drivers
- AZUREBLOB: Microsoft AZURE BLOB drivers
- AZURE_DATALAKE_GEN2: Microsoft AZURE GEN2 drivers
- class GcsDriver(creator, allow_external_delete, allow_external_modification, created_at, type, integration_id, integration_type, metadata, name, id, path, client_api: ApiClient, repositories=_Nothing.NOTHING, bucket=None)[source]¶
Bases:
Driver
- class S3Driver(creator, allow_external_delete, allow_external_modification, created_at, type, integration_id, integration_type, metadata, name, id, path, client_api: ApiClient, repositories=_Nothing.NOTHING, bucket_name=None, region=None, storage_class=None)[source]¶
Bases:
Driver
Item¶
- class Item(annotations_link, dataset_url, thumbnail, created_at, updated_at, updated_by, dataset_id, annotated, metadata, filename, stream, name, type, url, id, hidden, dir, spec, creator, description, src_item, annotations_count, client_api: ApiClient, platform_dict, dataset, model, project, project_id, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Item object
- clone(dst_dataset_id=None, remote_filepath=None, metadata=None, with_annotations=True, with_metadata=True, with_task_annotations_status=False, allow_many=False, wait=True)[source]¶
Clone item
- Parameters:
dst_dataset_id (str) – destination dataset id
remote_filepath (str) – complete filepath
metadata (dict) – new metadata to add
with_annotations (bool) – clone annotations
with_metadata (bool) – clone metadata
with_task_annotations_status (bool) – clone task annotations status
allow_many (bool) – bool if True, using multiple clones in single dataset is allowed, (default=False)
wait (bool) – wait for the command to finish
- Returns:
Item object
- Return type:
Example:
item.clone(dst_dataset_id='dst_dataset_id', with_metadata=True, with_task_annotations_status=False, with_annotations=False)
- download(local_path=None, file_types=None, save_locally=True, to_array=False, annotation_options: Optional[ViewAnnotationOptions] = None, overwrite=False, to_items_folder=True, thickness=1, with_text=False, annotation_filters=None, alpha=1, export_version=ExportVersion.V1)[source]¶
Download dataset items by filters. Filter the dataset for items and save them locally. Optionally also download the annotations, mask, instance, and image mask of each item.
- Parameters:
local_path (str) – local folder or filename to save to.
file_types (list) – a list of file type to download. e.g [‘video/webm’, ‘video/mp4’, ‘image/jpeg’, ‘image/png’]
save_locally (bool) – bool. save to disk or return a buffer
to_array (bool) – returns Ndarray when True and local_path = False
annotation_options (list) – download annotations options: list(dl.ViewAnnotationOptions)
annotation_filters (dtlpy.entities.filters.Filters) – Filters entity to filter annotations for download
overwrite (bool) – optional - default = False
to_items_folder (bool) – Create ‘items’ folder and download items to it
thickness (int) – optional - line thickness, if -1 annotation will be filled, default =1
with_text (bool) – optional - add text to annotations, default = False
alpha (float) – opacity value [0 1], default 1
export_version (str) – V2 - exported items will have original extension in filename, V1 - no original extension in filenames
- Returns:
generator of local_path per each downloaded item
- Return type:
generator or single item
Example:
item.download(local_path='local_path', annotation_options=dl.ViewAnnotationOptions.MASK, overwrite=False, thickness=1, with_text=False, alpha=1, save_locally=True )
- classmethod from_json(_json, client_api, dataset=None, project=None, model=None, is_fetched=True)[source]¶
Build an item entity object from a json
- Parameters:
project (dtlpy.entities.project.Project) – project entity
_json (dict) – _json response from host
dataset (dtlpy.entities.dataset.Dataset) – dataset in which the annotation’s item is located
model (dtlpy.entities.dataset.Model) – the model entity if item is an artifact of a model
client_api (dlApiClient) – ApiClient entity
is_fetched (bool) – is Entity fetched from Platform
- Returns:
Item object
- Return type:
- move(new_path)[source]¶
Move item from one folder to another in the platform. If the directory doesn’t exist, it will be created.
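Example (a minimal sketch; the remote path is a placeholder):
item.move(new_path='/new_folder/item_name.jpg')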
- set_description(text: str)[source]¶
Update Item description
- Parameters:
text (str) – if None or “”, the description will be deleted
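Example (a minimal sketch; the description text is a placeholder):
item.set_description(text='a short, human-readable description')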
- status(assignment_id: Optional[str] = None, task_id: Optional[str] = None)[source]¶
Get item status
- Parameters:
- Returns:
status
- Return type:
Example:
status = item.status(task_id='task_id')
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(system_metadata=False)[source]¶
Update the item’s metadata
- Parameters:
system_metadata (bool) – bool - True, if you want to change metadata system
- Returns:
Item object
- Return type:
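Example (a minimal sketch; writing user metadata locally and persisting it):
item.metadata['user'] = {'reviewed': True}
item = item.update()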
Item Link¶
Annotation¶
- class Annotation(id, url, item_url, item, item_id, creator, created_at, updated_by, updated_at, type, source, dataset_url, platform_dict, metadata, fps, hash=None, dataset_id=None, status=None, object_id=None, automated=None, item_height=None, item_width=None, label_suggestions=None, annotation_definition: Optional[BaseAnnotationDefinition] = None, frames=None, current_frame=0, end_frame=0, end_time=0, start_frame=0, start_time=0, dataset=None, datasets=None, annotations=None, Annotation__client_api=None, items=None, recipe_2_attributes=None)[source]¶
Bases:
BaseEntity
Annotations object
- add_frame(annotation_definition, frame_num=None, fixed=True, object_visible=True)[source]¶
Add a frame state to annotation
- Parameters:
- Returns:
True if success
- Return type:
Example:
success = annotation.add_frame(frame_num=10, annotation_definition=dl.Box(top=10, left=10, bottom=100, right=100, label='labelName'))
- add_frames(annotation_definition, frame_num=None, end_frame_num=None, start_time=None, end_time=None, fixed=True, object_visible=True)[source]¶
Add frame states to the annotation
Prerequisites: Any user can upload annotations.
- Parameters:
annotation_definition – annotation type object - must be same type as annotation
frame_num (int) – first frame number
end_frame_num (int) – last frame number
start_time – starting time for video
end_time – ending time for video
fixed (bool) – is fixed
object_visible (bool) – whether the annotated object is visible
- Returns:
Example:
annotation.add_frames(frame_num=10, annotation_definition=dl.Box(top=10, left=10, bottom=100, right=100, label='labelName'))
- delete()[source]¶
Remove an annotation from item
Prerequisites: You must have an item that has been annotated. You must have the role of an owner or developer or be assigned a task that includes that item as an annotation manager or annotator.
- Returns:
True if success
- Return type:
Example:
is_deleted = annotation.delete()
- download(filepath: str, annotation_format: ViewAnnotationOptions = ViewAnnotationOptions.JSON, height: Optional[float] = None, width: Optional[float] = None, thickness: int = 1, with_text: bool = False, alpha: float = 1)[source]¶
Save annotation to file
Prerequisites: Any user can upload annotations.
- Parameters:
filepath (str) – local path to where annotation will be downloaded to
annotation_format (list) – options: list(dl.ViewAnnotationOptions)
height (float) – image height
width (float) – image width
thickness (int) – line thickness
with_text (bool) – get mask with text
alpha (float) – opacity value [0 1], default 1
- Returns:
filepath
- Return type:
Example:
filepath = annotation.download(filepath='filepath', annotation_format=dl.ViewAnnotationOptions.MASK)
- classmethod from_json(_json, item=None, client_api=None, annotations=None, is_video=None, fps=None, item_metadata=None, dataset=None, is_audio=None)[source]¶
Create an annotation object from platform json
- Parameters:
_json (dict) – platform json
item (dtlpy.entities.item.Item) – item
client_api – ApiClient entity
annotations –
is_video (bool) – is video
fps – video fps
item_metadata – item metadata
dataset – dataset entity
is_audio (bool) – is audio
- Returns:
annotation object
- Return type:
- classmethod new(item=None, annotation_definition=None, object_id=None, automated=True, metadata=None, frame_num=None, parent_id=None, start_time=None, item_height=None, item_width=None, end_time=None)[source]¶
Create a new annotation object
Prerequisites: Any user can upload annotations.
- Parameters:
item (dtlpy.entities.item.Items) – item to annotate
annotation_definition – annotation type object
object_id (str) – object_id
automated (bool) – is automated
metadata (dict) – metadata
frame_num (int) – optional - first frame number if video annotation
parent_id (str) – add parent annotation ID
start_time – optional - start time if video annotation
item_height (float) – annotation item’s height
item_width (float) – annotation item’s width
end_time – optional - end time if video annotation
- Returns:
annotation object
- Return type:
Example:
annotation = dl.Annotation.new(item=item, annotation_definition=dl.Box(top=10, left=10, bottom=100, right=100, label='labelName'))
- set_frame(frame)[source]¶
Set annotation to frame state
Prerequisites: Any user can upload annotations.
Example:
success = annotation.set_frame(frame=10)
- show(image=None, thickness=None, with_text=False, height=None, width=None, annotation_format: ViewAnnotationOptions = ViewAnnotationOptions.MASK, color=None, label_instance_dict=None, alpha=1, frame_num=None)[source]¶
Show annotations: draw the annotation on the image array and return it
Prerequisites: Any user can upload annotations.
- Parameters:
image – empty or image to draw on
thickness (int) – line thickness
with_text (bool) – add label to annotation
height (float) – height
width (float) – width
annotation_format (dl.ViewAnnotationOptions) – list(dl.ViewAnnotationOptions)
color (tuple) – optional - color tuple
label_instance_dict – the instance labels
alpha (float) – opacity value [0 1], default 1
frame_num (int) – for video annotation, show a specific frame
- Returns:
list or single ndarray of the annotations
Example:
image = annotation.show(image='ndarray', thickness=1, annotation_format=dl.ViewAnnotationOptions.MASK)
- to_json()[source]¶
Convert annotation object to a platform json representation
- Returns:
platform json
- Return type:
- update(system_metadata=False)[source]¶
Update an existing annotation in host.
Prerequisites: You must have an item that has been annotated. You must have the role of an owner or developer or be assigned a task that includes that item as an annotation manager or annotator.
- Parameters:
system_metadata – True, if you want to change metadata system
- Returns:
Annotation object
- Return type:
Example:
annotation = annotation.update()
- update_status(status: AnnotationStatus = AnnotationStatus.ISSUE)[source]¶
Set status on annotation
Prerequisites: You must have an item that has been annotated. You must have the role of an owner or developer or be assigned a task that includes that item as an annotation manager.
- Parameters:
status (str) – can be AnnotationStatus.ISSUE, AnnotationStatus.APPROVED, AnnotationStatus.REVIEW, AnnotationStatus.CLEAR
- Returns:
Annotation object
- Return type:
Example:
annotation = annotation.update_status(status=dl.AnnotationStatus.ISSUE)
- class FrameAnnotation(annotation, annotation_definition, frame_num, fixed, object_visible, recipe_2_attributes=None, interpolation=False)[source]¶
Bases:
BaseEntity
FrameAnnotation object
- classmethod from_snapshot(annotation, _json, fps)[source]¶
Create a new frame state for the annotation from a snapshot
- Parameters:
annotation – annotation
_json – annotation type object - must be same type as annotation
fps – frames per second
- Returns:
FrameAnnotation object
- classmethod new(annotation, annotation_definition, frame_num, fixed, object_visible=True)[source]¶
Create a new frame state for the annotation
- Parameters:
annotation – annotation
annotation_definition – annotation type object - must be same type as annotation
frame_num – frame number
fixed – is fixed
object_visible – whether the annotated object is visible
- Returns:
FrameAnnotation object
- class ViewAnnotationOptions(value)[source]¶
The Annotations file types to download (JSON, MASK, INSTANCE, ANNOTATION_ON_IMAGE, VTT, OBJECT_ID).
- JSON: Dataloop json format
- MASK: PNG file that contains drawing annotations on it
- INSTANCE: An image file that contains 2D annotations
- ANNOTATION_ON_IMAGE: The source image with the annotations drawn on it
- VTT: A text file that contains supplementary information about a web video
- OBJECT_ID: An image file that contains 2D annotations
Collection of Annotation entities¶
- class AnnotationCollection(item=None, annotations=_Nothing.NOTHING, dataset=None, colors=None)[source]¶
Bases:
BaseEntity
Collection of Annotation entity
- add(annotation_definition, object_id=None, frame_num=None, end_frame_num=None, start_time=None, end_time=None, automated=True, fixed=True, object_visible=True, metadata=None, parent_id=None, prompt_id=None, model_info=None)[source]¶
Add annotations to collection
- Parameters:
annotation_definition – dl.Polygon, dl.Segmentation, dl.Point, dl.Box etc.
object_id – Object id (any id given by the user). For video, must be provided to match annotations between frames
frame_num – video only, number of frame
end_frame_num – video only, the end frame of the annotation
start_time – video only, start time of the annotation
end_time – video only, end time of the annotation
automated –
fixed – video only, mark frame as fixed
object_visible – video only, whether the annotated object is visible
metadata – optional, metadata dictionary for annotation
parent_id – set a parent for this annotation (parent annotation ID)
prompt_id – Connect the annotation with a specific prompt in a dl.PromptItem
model_info – optional - set model on annotation {‘confidence’: 0, # [Mandatory], (Float between 0-1) ‘name’: ‘’, # [Optional], (‘name’ refers to ‘model_name’) ‘model_id’: ‘’} # [Optional]
- Returns:
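Example (a minimal sketch; builds a collection with the item’s annotation builder and uploads it back to the item):
builder = item.annotations.builder()
builder.add(annotation_definition=dl.Box(top=10, left=10, bottom=100, right=100, label='labelName'))
item.annotations.upload(builder)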
- delete()[source]¶
Remove an annotation from item
Prerequisites: You must have an item that has been annotated. You must have the role of an owner or developer or be assigned a task that includes that item as an annotation manager or annotator.
- Returns:
True if success
- Return type:
Example:
is_deleted = builder.delete()
- download(filepath, img_filepath=None, annotation_format: ViewAnnotationOptions = ViewAnnotationOptions.JSON, height=None, width=None, thickness=1, with_text=False, orientation=0, alpha=1)[source]¶
Save annotations to file
Prerequisites: Any user can upload annotations.
- Parameters:
filepath (str) – path to save annotation
img_filepath (str) – img file path - needed for img_mask
annotation_format (dl.ViewAnnotationOptions) – how to show the annotations. options: list(dl.ViewAnnotationOptions)
height (int) – height
width (int) – width
thickness (int) – thickness
with_text (bool) – add a text to an image
orientation (int) – the image orientation
alpha (float) – opacity value [0 1], default 1
- Returns:
file path of the download annotation
- Return type:
Example:
filepath = builder.download(filepath='filepath', annotation_format=dl.ViewAnnotationOptions.MASK)
- from_instance_mask(mask, instance_map=None)[source]¶
Convert annotation from instance mask format
- Parameters:
mask – the mask annotation
instance_map – labels
- classmethod from_json(_json: list, item=None, is_video=None, fps=25, height=None, width=None, client_api=None, is_audio=None) → AnnotationCollection[source]¶
Create an annotation collection object from platform json
- Parameters:
- Returns:
annotation object
- Return type:
- from_vtt_file(filepath)[source]¶
Convert annotation from VTT format
- Parameters:
filepath (str) – path to the file
- get_frame(frame_num)[source]¶
Get frame
- Parameters:
frame_num (int) – frame num
- Returns:
AnnotationCollection
- show(image=None, thickness=None, with_text=False, height=None, width=None, annotation_format: ViewAnnotationOptions = ViewAnnotationOptions.MASK, label_instance_dict=None, color=None, alpha=1.0, frame_num=None)[source]¶
Show annotations according to annotation_format
Prerequisites: Any user can upload annotations.
- Parameters:
image (ndarray) – empty or image to draw on
height (int) – height
width (int) – width
thickness (int) – line thickness
with_text (bool) – add label to annotation
annotation_format (dl.ViewAnnotationOptions) – how to show the annotations. options: list(dl.ViewAnnotationOptions)
label_instance_dict (dict) – instance label map {‘Label’: 1, ‘More’: 2}
color (tuple) – optional - color tuple
alpha (float) – opacity value [0 1], default 1
frame_num (int) – for video annotation, show specific frame
- Returns:
ndarray of the annotations
Example:
image = builder.show(image='ndarray', thickness=1, annotation_format=dl.ViewAnnotationOptions.MASK)
- to_json()[source]¶
Convert annotation object to a platform json representation
- Returns:
platform json
- Return type:
- update(system_metadata=True)[source]¶
Update an existing annotation in host.
Prerequisites: You must have an item that has been annotated. You must have the role of an owner or developer or be assigned a task that includes that item as an annotation manager or annotator.
- Parameters:
system_metadata – True, if you want to change metadata system
- Returns:
Annotation object
- Return type:
Example:
annotation = builder.update()
Annotation Definition¶
Box Annotation Definition¶
- class Box(left=None, top=None, right=None, bottom=None, label=None, attributes=None, description=None, angle=None)[source]¶
Bases:
BaseAnnotationDefinition
Box annotation object. Can create a box using 2 points: “top”, “left”, “bottom”, “right” (to form a box [(left, top), (right, bottom)]). For a rotated box, add the “angle”.
- classmethod from_segmentation(mask, label, attributes=None)[source]¶
Convert binary mask to Polygon
- Parameters:
mask – binary mask (0,1)
label – annotation label
attributes – annotations list of attributes
- Returns:
Box annotations list to each separated segmentation
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Classification Annotation Definition¶
- class Classification(label, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Classification annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Cuboid Annotation Definition¶
- class Cube(label, front_tl, front_tr, front_br, front_bl, back_tl, back_tr, back_br, back_bl, angle=None, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Cube annotation object
- classmethod from_boxes_and_angle(front_left, front_top, front_right, front_bottom, back_left, back_top, back_right, back_bottom, label, angle=0, attributes=None)[source]¶
Create a cuboid from given front and back boxes with an angle; the angle is calculated from the center of each box
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Item Description Definition¶
Ellipse Annotation Definition¶
- class Ellipse(x, y, rx, ry, angle, label, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Ellipse annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Note Annotation Definition¶
Point Annotation Definition¶
- class Point(x, y, label, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Point annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Polygon Annotation Definition¶
- class Polygon(geo, label, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Polygon annotation object
- classmethod from_segmentation(mask, label, attributes=None, epsilon=None, max_instances=1, min_area=0)[source]¶
Convert binary mask to Polygon
- Parameters:
mask – binary mask (0,1)
label – annotation label
attributes – annotations list of attributes
epsilon – from opencv: specifies the approximation accuracy. This is the maximum distance between the original curve and its approximation. If 0, all points are returned
max_instances – maximum number of instances to return. If None, all will be returned
min_area – remove polygons with area lower than this threshold (pixels)
- Returns:
Polygon annotation
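Example (a minimal sketch; binary_mask is assumed to be a numpy array of 0/1 values):
polygon = dl.Polygon.from_segmentation(mask=binary_mask, label='labelName', max_instances=1)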
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Polyline Annotation Definition¶
- class Polyline(geo, label, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Polyline annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Pose Annotation Definition¶
- class Pose(label, template_id, instance_id=None, attributes=None, points=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
Pose annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Segmentation Annotation Definition¶
- class Segmentation(geo, label, attributes=None, description=None, color=None)[source]¶
Bases:
BaseAnnotationDefinition
Segmentation annotation object
- classmethod from_polygon(geo, label, shape, attributes=None)[source]¶
- Parameters:
geo – list of x,y coordinates of the polygon ([[x,y],[x,y],...])
label – annotation’s label
shape – image shape (h,w)
attributes –
- Returns:
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Audio Annotation Definition¶
Undefined Annotation Definition¶
- class UndefinedAnnotationType(type, label, coordinates, attributes=None, description=None)[source]¶
Bases:
BaseAnnotationDefinition
UndefinedAnnotationType annotation object
- show(image, thickness, with_text, height, width, annotation_format, color, alpha=1)[source]¶
Show annotation as ndarray
- Parameters:
image – empty or image to draw on
thickness –
with_text – not required
height – item height
width – item width
annotation_format – options: list(dl.ViewAnnotationOptions)
color – color
alpha – opacity value [0 1], default 1
- Returns:
ndarray
Similarity¶
Filter¶
- class Filters(field=None, values=None, operator: Optional[FiltersOperations] = None, method: Optional[FiltersMethod] = None, custom_filter=None, resource: FiltersResource = FiltersResource.ITEM, use_defaults=True, context=None, page_size=None)[source]¶
Bases:
object
Filters entity to filter items from pages in platform
- add(field, values, operator: Optional[FiltersOperations] = None, method: Optional[FiltersMethod] = None)[source]¶
Add filter
- Parameters:
field (str) – Metadata field / attribute
values – field values
operator (dl.FiltersOperations) – optional - in, gt, lt, eq, ne
method (dl.FiltersMethod) – Optional - or/and
Example:
filter.add(field='metadata.user', values=['1','2'], operator=dl.FiltersOperations.IN)
- add_join(field, values, operator: Optional[FiltersOperations] = None, method: FiltersMethod = FiltersMethod.AND)[source]¶
join a query to the filter
- Parameters:
Example:
filter.add_join(field='metadata.user', values=['1','2'], operator=dl.FiltersOperations.IN)
- static list(project: Project) → list[source]¶
List all saved filters for a project
- Parameters:
project – dl.Project entity
- Returns:
a list of all the saved filters’ names
- classmethod load(project: Project, filter_name: str) → Filters[source]¶
Load a saved filter from the project by name
- Parameters:
project – dl.Project entity
filter_name – filter name
- Returns:
dl.Filters
- open_in_web(resource)[source]¶
Open the filter in the platform data browser (in a new web browser)
- Parameters:
resource (str) – dl entity to apply filter on. currently only supports dl.Dataset
- prepare(operation=None, update=None, query_only=False, system_update=None, system_metadata=False)[source]¶
Convert to a dictionary for a platform call
- save(project: Project, filter_name: str)[source]¶
Save the current DQL filter to the project
- Parameters:
project – dl.Project
filter_name – the saved filter’s name
- Returns:
True if success
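Example (a minimal sketch; the filter field and filter name are placeholders):
filters = dl.Filters(field='dir', values='/folder')
filters.save(project=project, filter_name='my_saved_filter')
loaded_filters = dl.Filters.load(project=project, filter_name='my_saved_filter')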
- sort_by(field, value: FiltersOrderByDirection = FiltersOrderByDirection.ASCENDING)[source]¶
Sort the filter results
- Parameters:
field (str) – field to sort by it
value (dl.FiltersOrderByDirection) – FiltersOrderByDirection.ASCENDING, FiltersOrderByDirection.DESCENDING
Example:
filter.sort_by(field='metadata.user', value=dl.FiltersOrderByDirection.ASCENDING)
Recipe¶
- class Recipe(id, creator, url, title, project_ids, description, ontology_ids, instructions, examples, custom_actions, metadata, created_at, updated_at, updated_by, ui_settings, client_api: ApiClient, dataset=None, project=None, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Recipe object
- add_instruction(annotation_instruction_file)[source]¶
Add instruction to recipe
- Parameters:
annotation_instruction_file (str) – file path or url of the recipe instruction
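Example (a minimal sketch; the file path is a placeholder):
recipe.add_instruction(annotation_instruction_file='/path/to/instructions.pdf')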
- clone(shallow=False)[source]¶
Clone Recipe
- Parameters:
shallow (bool) – if True, link to the existing ontology; otherwise, clone all ontologies that are linked to the recipe as well
- Returns:
Cloned recipe object
- Return type:
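Example (a minimal sketch):
recipe_clone = recipe.clone(shallow=False)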
- classmethod from_json(_json, client_api, dataset=None, project=None, is_fetched=True)[source]¶
Build a Recipe entity object from a json
- Parameters:
_json (dict) – _json response from host
dataset (dtlpy.entities.dataset.Dataset) – Dataset entity
project (dtlpy.entities.project.Project) – project entity
client_api (dl.ApiClient) – ApiClient entity
is_fetched (bool) – is Entity fetched from Platform
- Returns:
Recipe object
- get_annotation_template_id(template_name)[source]¶
Get annotation template id by template name
- Parameters:
template_name (str) –
- Returns:
template id, or None if it does not exist
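Example (a minimal sketch; the template name is a placeholder):
template_id = recipe.get_annotation_template_id(template_name='template_name')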
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(system_metadata=False)[source]¶
Update Recipe
- Parameters:
system_metadata (bool) – bool - True, if you want to change metadata system
- Returns:
Recipe object
- Return type:
Ontology¶
- class Ontology(client_api: ApiClient, id, creator, url, title, labels, metadata, attributes, recipe=None, dataset=None, project=None, repositories=_Nothing.NOTHING, instance_map=None, color_map=None)[source]¶
Bases:
BaseEntity
Ontology object
- add_label(label_name, color=None, children=None, attributes=None, display_label=None, label=None, add=True, icon_path=None, update_ontology=False)[source]¶
Add a single label to ontology
- Parameters:
label_name (str) – str - label name
color (tuple) – color
children – children (sub labels)
attributes (list) – attributes
display_label (str) – display_label
label (dtlpy.entities.label.Label) – label
add (bool) – to add or not
icon_path (str) – path to an image to be displayed on the label
update_ontology (bool) – update the ontology, default = False for backward compatibility
- Returns:
Label entity
- Return type:
dtlpy.entities.label.Label
Example:
label = ontology.add_label(label_name='person', color=(34, 6, 231), attributes=['big', 'small'])
- add_labels(label_list, update_ontology=False)[source]¶
Adds a list of labels to ontology
- Parameters:
- Returns:
List of label entities added
Example:
labels = ontology.add_labels(label_list=label_list)
- copy_from(ontology_json: dict)[source]¶
Import ontology to the platform.
Notice: only the following fields will be updated: labels, attributes, instance_map and color_map.
- Parameters:
ontology_json (dict) – The source ontology json to copy from
- Returns:
Ontology object: The updated ontology entity
- Return type:
Example:
ontology = ontology.copy_from(ontology_json=ontology_json)
- delete_attributes(keys: list)[source]¶
Delete a bulk of attributes
Example:
success = ontology.delete_attributes(['1'])
- delete_labels(label_names)[source]¶
Delete labels from ontology
- Parameters:
label_names – label object/ label name / list of label objects / list of label names
- Returns:
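Example (a minimal sketch; the label names are placeholders):
ontology.delete_labels(label_names=['myLabel1', 'myLabel2'])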
- classmethod from_json(_json, client_api, recipe, dataset=None, project=None, is_fetched=True)[source]¶
Build an Ontology entity object from a json
- Parameters:
is_fetched (bool) – is Entity fetched from Platform
project (dtlpy.entities.project.Project) – project entity
dataset (dtlpy.entities.dataset.Dataset) – dataset
_json (dict) – _json response from host
recipe (dtlpy.entities.recipe.Recipe) – ontology’s recipe
client_api (dl.ApiClient) – ApiClient entity
- Returns:
Ontology object
- Return type:
- property instance_map¶
instance mapping for creating instance mask
- Returns:
dictionary {label: map_id}
- Return type:
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(system_metadata=False)[source]¶
Update the ontology
- Parameters:
system_metadata (bool) – bool - True, if you want to change metadata system
- Returns:
Ontology object
- update_attributes(title: str, key: str, attribute_type, scope: Optional[list] = None, optional: Optional[bool] = None, values: Optional[list] = None, attribute_range=None)[source]¶
Add a new attribute or update it if it exists
- Parameters:
title (str) – attribute title
key (str) – the key of the attribute, must be unique
attribute_type (AttributesTypes) – dl.AttributesTypes your attribute type
scope (list) – list of the labels or * for all labels
optional (bool) – optional attribute
values (list) – list of the attribute values ( for checkbox and radio button)
attribute_range (dict or AttributesRange) – dl.AttributesRange object
- Returns:
True if success
- Return type:
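Example (a minimal sketch, mirroring the dataset-level call above; the key, title, and values are placeholders):
success = ontology.update_attributes(key='1', title='checkbox', attribute_type=dl.AttributesTypes.CHECKBOX, values=[1, 2, 3])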
- update_label(label_name, color=None, children=None, attributes=None, display_label=None, label=None, add=True, icon_path=None, upsert=False, update_ontology=False)[source]¶
Update a single label to ontology
- Parameters:
label_name (str) – str - label name
color (tuple) – color
children – children (sub labels)
attributes (list) – attributes
display_label (str) – display_label
label (dtlpy.entities.label.Label) – label
add (bool) – to add or not
icon_path (str) – path to an image to be displayed on the label
update_ontology (bool) – update the ontology, default = False for backward compatibility
upsert (bool) – if True, will add the label in case it does not exist
- Returns:
Label entity
- Return type:
dtlpy.entities.label.Label
Example:
label = ontology.update_label(label_name='person', color=(34, 6, 231), attributes=['big', 'small'])
- update_labels(label_list, upsert=False, update_ontology=False)[source]¶
Update a list of labels to ontology
- Parameters:
label_list (list) – list of labels [{“value”: {“tag”: “tag”, “displayLabel”: “displayLabel”, “color”: “#color”, “attributes”: [attributes]}, “children”: [children]}]
upsert (bool) – if True, will add the labels in case they do not exist
update_ontology (bool) – update the ontology, default = False for backward compatibility
- Returns:
List of label entities added
Example:
labels = ontology.update_labels(label_list=label_list)
Label¶
Task¶
- class Task(name, status, project_id, metadata, id, url, task_owner, item_status, creator, due_date, dataset_id, spec, recipe_id, query, assignmentIds, annotation_status, progress, for_review, issues, updated_at, created_at, available_actions, total_items, priority, description, client_api, current_assignments=None, assignments=None, project=None, dataset=None, tasks=None, settings=None)[source]¶
Bases:
object
Task object
- add_items(filters=None, items=None, assignee_ids=None, workload=None, limit=None, wait=True, query=None)[source]¶
Add items to Task
- Parameters:
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
items (list) – list of items (item Ids or objects) to add to the task
assignee_ids (list) – list to assignee who works in the task
workload (list) – list of WorkloadUnit objects. Customize distribution (percentage) between the task assignees. For example: [dl.WorkloadUnit('annotator@hi.com', 80), dl.WorkloadUnit('annotator2@hi.com', 20)]
limit (int) – the limit items that task can include
wait (bool) – wait until adding the items finishes
query (dict) – query to filter the items for the task
- Returns:
task entity
- Return type:
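Example (a minimal sketch; the filter value and assignee email are placeholders):
filters = dl.Filters(field='dir', values='/new_batch')
task = task.add_items(filters=filters, assignee_ids=['annotator1@dataloop.ai'])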
- create_assignment(assignment_name, assignee_id, items=None, filters=None)[source]¶
Create a new assignment
- Parameters:
assignment_name (str) – assignment name
assignee_id (str) – the assignment assignees (contributors) that should be working on the task. Provide a user email
items (List[entities.Item]) – list of items (item Id or objects) to insert to the task
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
- Returns:
Assignment object
- Return type:
dtlpy.entities.assignment.Assignment assignment
Example:
assignment = task.create_assignment(assignee_id='annotator1@dataloop.ai')
- create_qa_task(due_date, assignee_ids, filters=None, items=None, query=None, workload=None, metadata=None, available_actions=None, wait=True, batch_size=None, max_batch_workload=None, allowed_assignees=None, priority=TaskPriority.MEDIUM)[source]¶
Create a new QA Task
- Parameters:
due_date (float) – date by which the QA task should be finished; for example, due_date=datetime.datetime(day=1, month=1, year=2029).timestamp()
assignee_ids (list) – list the QA task assignees (contributors) that should be working on the task. Provide a list of users’ emails
filters (entities.Filters) – dl.Filters entity to filter items for the task
items (List[entities.Item]) – list of items (item Id or objects) to insert to the task
query (dict DQL) – filter items for the task
workload (List[WorkloadUnit]) – list of WorkloadUnit objects. Customize distribution (percentage) between the task assignees. For example: [dl.WorkloadUnit(annotator@hi.com, 80), dl.WorkloadUnit(annotator2@hi.com, 20)]
metadata (dict) – metadata for the task
available_actions (list) – list of available actions (statuses) that will be available for the task items; The default statuses are: “approved” and “discard”
wait (bool) – wait until create task finish
batch_size (int) – Pulling batch size (items), use with pulling allocation method. Restrictions - Min 3, max 100
max_batch_workload (int) – Max items in assignment, use with pulling allocation method. Restrictions - Min batchSize + 2, max batchSize * 2
allowed_assignees (list) – list the task assignees (contributors) that should be working on the task. Provide a list of users’ emails
priority (entities.TaskPriority) – priority of the task options in entities.TaskPriority
- Returns:
task object
- Return type:
Example:
task = task.create_qa_task(due_date=datetime.datetime(day=1, month=1, year=2029).timestamp(), assignee_ids=['annotator1@dataloop.ai', 'annotator2@dataloop.ai'])
- classmethod from_json(_json, client_api, project=None, dataset=None)[source]¶
Return the task object from the json
- Parameters:
_json (dict) – platform json that describe the task
client_api – ApiClient object
project (dtlpy.entities.project.Project) – project object in which the task will be created
dataset (dtlpy.entities.dataset.Dataset) – dataset object that the task refers to
- Returns:
- get_items(filters=None, get_consensus_items: bool = False)[source]¶
Get the task items
- Parameters:
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
- Returns:
list of the items or PagedEntity output of items
- Return type:
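Example (a minimal sketch; the filter values are placeholders):
pages = task.get_items(filters=dl.Filters(field='dir', values='/my_folder'))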
- remove_items(filters: Optional[Filters] = None, query=None, items=None, wait=True)[source]¶
Remove items from a task.
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned to be owner of the annotation task.
- Parameters:
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
query (dict) – query to filter the items
items (list) – list of items to remove from the task
wait (bool) – wait until removing the items finishes
- Returns:
True if success and an error if failed
- Return type:
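Example (a minimal sketch; the filter values are placeholders):
success = task.remove_items(filters=dl.Filters(field='dir', values='/deprecated'))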
- set_status(status: str, operation: str, item_ids: List[str])[source]¶
Update item status within task
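Example (a minimal sketch; the status, operation, and item ids are placeholders):
task.set_status(status='completed', operation='created', item_ids=['item_id'])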
Assignment¶
- class Assignment(name, annotator, status, project_id, metadata, id, url, updated_at, updated_by, task_id, dataset_id, annotation_status, item_status, total_items, for_review, issues, client_api, task=None, assignments=None, project=None, dataset=None, datasets=None)[source]¶
Bases:
BaseEntity
Assignment object
- get_items(dataset=None, filters=None)[source]¶
Get all the items in the assignment
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Parameters:
dataset (dtlpy.entities.dataset.Dataset) – dataset object; the dataset that the assignment refers to
filters (dtlpy.entities.filters.Filters) – Filters entity or a dictionary containing filters parameters
- Returns:
pages of the items
- Return type:
Example:
items = task.assignments.get_items()
- open_in_web()[source]¶
Open the assignment in web platform
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Returns:
Example:
assignment.open_in_web()
- reassign(assignee_id, wait=True)[source]¶
Reassign an assignment
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Parameters:
- Returns:
Assignment object
- Return type:
Example:
assignment = assignment.reassign(assignee_id='annotator1@dataloop.ai')
- redistribute(workload, wait=True)[source]¶
Redistribute an assignment
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Parameters:
workload (dtlpy.entities.assignment.Workload) – list of WorkloadUnit objects. Customize distribution (percentage) between the task assignees. For example: [dl.WorkloadUnit(annotator@hi.com, 80), dl.WorkloadUnit(annotator2@hi.com, 20)]
wait (bool) – wait until redistribute assignment finish
- Returns:
Assignment object
- Return type:
dtlpy.entities.assignment.Assignment assignment
Example:
assignment = assignment.redistribute(workload=dl.Workload([dl.WorkloadUnit(assignee_id="annotator1@dataloop.ai", load=50), dl.WorkloadUnit(assignee_id="annotator2@dataloop.ai", load=50)]))
- set_status(status: str, operation: str, item_id: str)[source]¶
Set item status within assignment
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Parameters:
- Returns:
True if success
- Return type:
Example:
success = assignment.set_status(status='complete', operation='created', item_id='item_id')
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- update(system_metadata=False)[source]¶
Update an assignment
Prerequisites: You must be in the role of an owner, developer, or annotation manager who has been assigned as owner of the annotation task.
- Parameters:
system_metadata (bool) – True, if you want to update the system metadata
- Returns:
Assignment object
- Return type:
dtlpy.entities.assignment.Assignment assignment
Example:
assignment = assignment.update(system_metadata=False)
Package¶
- class Package(_dict=None, **kwargs)[source]¶
Bases:
DlEntity
Package object
- build(module_name=None, init_inputs=None, local_path=None, from_local=None)[source]¶
Instantiate a module from the package code. Returns a loaded instance of the runner class
- Parameters:
module_name – Name of the module to build the runner class
init_inputs (str) – dictionary of the class init variables (if any); used to init the module class
local_path (str) – local path of the package (if from_local=False - codebase will be downloaded)
from_local (bool) – if True, the codebase will not be downloaded (only local files are used)
- Returns:
dl.BaseServiceRunner
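Example (a minimal sketch; the module name and local path are placeholders):
runner = package.build(module_name='default_module', local_path='path_to_package', from_local=True)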
- deploy(service_name=None, revision=None, init_input=None, runtime=None, sdk_version=None, agent_versions=None, verify=True, bot=None, pod_type=None, module_name=None, run_execution_as_process=None, execution_timeout=None, drain_time=None, on_reset=None, max_attempts=None, force=False, secrets: Optional[list] = None, **kwargs)[source]¶
Deploy package
- Parameters:
service_name (str) – service name
revision (str) – package revision - default=latest
init_input – config to run at startup
runtime (dict) – runtime resources
sdk_version (str) – optional - sdk version
agent_versions (dict) – optional - versions of sdk, agent runner and agent proxy
bot (str) – bot email
pod_type (str) – pod type dl.InstanceCatalog
verify (bool) – verify the inputs
module_name (str) – module name
run_execution_as_process (bool) – run execution as process
execution_timeout (int) – execution timeout
drain_time (int) – drain time
on_reset (str) – on reset
max_attempts (int) – Maximum execution retries in case of a service reset
force (bool) – optional - terminate old replicas immediately
secrets (list) – list of the integrations ids
- Returns:
Service object
- Return type:
Example:
service: dl.Service = package.deploy(service_name=package_name,
                                     execution_timeout=3 * 60 * 60,
                                     module_name=module.name,
                                     runtime=dl.KubernetesRuntime(concurrency=10,
                                                                  pod_type=dl.InstanceCatalog.REGULAR_S,
                                                                  autoscaler=dl.KubernetesRabbitmqAutoscaler(min_replicas=1,
                                                                                                             max_replicas=20,
                                                                                                             queue_length=20)))
- classmethod from_json(_json, client_api, project, is_fetched=True)[source]¶
Turn platform representation of package into a package entity
- Parameters:
_json (dict) – platform representation of package
client_api (dl.ApiClient) – ApiClient entity
project (dtlpy.entities.project.Project) – project entity
is_fetched – is Entity fetched from Platform
- Returns:
Package entity
- Return type:
- static get_ml_metadata(cls=None, available_methods=None, output_type=AnnotationType.CLASSIFICATION, input_type='image', default_configuration: Optional[dict] = None)[source]¶
Create ML metadata for the package
- Parameters:
cls – ModelAdapter class, to get the list of available_methods
available_methods – available user functions on the adapter: ['load', 'save', 'predict', 'train']
output_type – annotation type the model creates, e.g. dl.AnnotationType.CLASSIFICATION
input_type – input file type the model gets, one of ['image', 'video', 'txt']
default_configuration (dict) – default service configuration for the deployed services
- Returns:
- pull(version=None, local_path=None) str [source]¶
Pull local package
Example:
path = package.pull(local_path='local_path')
- push(codebase: Optional[Union[GitCodebase, ItemCodebase]] = None, src_path: Optional[str] = None, package_name: Optional[str] = None, modules: Optional[list] = None, checkout: bool = False, revision_increment: Optional[str] = None, service_update: bool = False, service_config: Optional[dict] = None, package_type='faas')[source]¶
Push local package
- Parameters:
codebase (dtlpy.entities.codebase.Codebase) – PackageCode object - defines how to store the package code
checkout (bool) – save package to local checkout
src_path (str) – location of the package codebase folder to zip
package_name (str) – name of package
modules (list) – list of PackageModule
revision_increment (str) – optional - str - version bumping method - major/minor/patch - default = None
service_update (bool) – optional - bool - update the service
service_config (dict) – Service object as dict. Contains the spec of the default service to create.
package_type (str) – default is "faas", one of "app", "ml"
- Returns:
package entity
- Return type:
dtlpy.entities.package.Package
Example:
package = packages.push(package_name='package_name', modules=[module], version='1.0.0', src_path=os.getcwd())
- test(cwd=None, concurrency=None, module_name='default_module', function_name='run', class_name='ServiceRunner', entry_point='main.py')[source]¶
Test local package in local environment.
- Parameters:
- Returns:
list created by the function that tested the output
- Return type:
Example:
package.test(cwd='path_to_package', function_name='run')
Package Function¶
Package Module¶
Slot¶
Codebase¶
Service¶
- class InstanceCatalog(value)[source]¶
-
The Service Pod size.
State
Description
REGULAR_XS
regular pod with extra small size
REGULAR_S
regular pod with small size
REGULAR_M
regular pod with medium size
REGULAR_L
regular pod with large size
HIGHMEM_XS
highmem pod with extra small size
HIGHMEM_S
highmem pod with small size
HIGHMEM_M
highmem pod with medium size
HIGHMEM_L
highmem pod with large size
GPU_K80_S
GPU NVIDIA K80 pod with small size
GPU_K80_M
GPU NVIDIA K80 pod with medium size
GPU_T4_S
GPU NVIDIA T4 pod with regular memory
GPU_T4_M
GPU NVIDIA T4 pod with highmem
- KubernetesAutuscalerType[source]¶
alias of
KubernetesAutoscalerType
- class OnResetAction(value)[source]¶
-
The Execution action when the service resets (RERUN, FAILED).
State
Description
RERUN
When the service resets, rerun the execution
FAILED
When the service resets, fail the execution
- class RuntimeType(value)[source]¶
-
The service runtime environment (KUBERNETES).
State
Description
KUBERNETES
The service runs in a Kubernetes environment
- class Service(created_at, updated_at, creator, version, package_id, package_revision, bot, use_user_jwt, init_input, versions, module_name, name, url, id, active, driver_id, secrets, runtime: KubernetesRuntime, queue_length_limit, run_execution_as_process: bool, execution_timeout, drain_time, on_reset: OnResetAction, type: ServiceType, project_id, org_id, is_global, max_attempts, mode, metadata, archive, config, settings, panels, package, client_api: ApiClient, revisions=None, project=None, repositories=_Nothing.NOTHING, updated_by=None, app=None, integrations=None)[source]¶
Bases:
BaseEntity
Service object
- activate_slots(project_id: Optional[str] = None, task_id: Optional[str] = None, dataset_id: Optional[str] = None, org_id: Optional[str] = None, user_email: Optional[str] = None, slots=None, role=None, prevent_override: bool = True, visible: bool = True, icon: str = 'fas fa-magic', **kwargs) object [source]¶
Activate service slots
- Parameters:
project_id (str) – project id
task_id (str) – task id
dataset_id (str) – dataset id
org_id (str) – org id
user_email (str) – user email
slots (list) – list of entities.PackageSlot
role (str) – user role MemberOrgRole.ADMIN, MemberOrgRole.OWNER, MemberOrgRole.MEMBER, MemberOrgRole.WORKER
prevent_override (bool) – True to prevent override
visible (bool) – visible
icon (str) – icon
kwargs – all additional arguments
- Returns:
list of user setting for activated slots
- Return type:
Example:
setting = service.activate_slots(project_id='project_id', slots=List[entities.PackageSlot], icon='fas fa-magic')
- execute(execution_input=None, function_name=None, resource=None, item_id=None, dataset_id=None, annotation_id=None, project_id=None, sync=False, stream_logs=True, return_output=True)[source]¶
Execute a function on an existing service
- Parameters:
execution_input (List[FunctionIO] or dict) – input dictionary or list of FunctionIO entities
function_name (str) – function name to run
resource (str) – input type.
item_id (str) – optional - item id as input to function
dataset_id (str) – optional - dataset id as input to function
annotation_id (str) – optional - annotation id as input to function
project_id (str) – resource’s project
sync (bool) – if true, wait for function to end
stream_logs (bool) – prints logs of the new execution. only works with sync=True
return_output (bool) – if True and sync is True - will return the output directly
- Returns:
execution object
- Return type:
Example:
execution = service.execute(function_name='function_name', item_id='item_id', project_id='project_id')
- execute_batch(filters, function_name: Optional[str] = None, execution_inputs: Optional[list] = None, wait=True)[source]¶
Execute a function on an existing service
Prerequisites: You must be in the role of an owner or developer. You must have a service.
- Parameters:
- Returns:
execution object
- Return type:
Example:
command = service.execute_batch(execution_inputs=dl.FunctionIO(type=dl.PackageInputType.STRING, value='test', name='string'), filters=dl.Filters(field='dir', values='/test', context={"datasets": [dataset.id]}), function_name='run')
- classmethod from_json(_json: dict, client_api: Optional[ApiClient] = None, package=None, project=None, is_fetched=True)[source]¶
Build a service entity object from a json
- Parameters:
_json (dict) – platform json
client_api (dl.ApiClient) – ApiClient entity
package (dtlpy.entities.package.Package) – package entity
project (dtlpy.entities.project.Project) – project entity
is_fetched (bool) – is Entity fetched from Platform
- Returns:
service object
- Return type:
- log(size=None, checkpoint=None, start=None, end=None, follow=False, text=None, execution_id=None, function_name=None, replica_id=None, system=False, view=True, until_completed=True, model_id: Optional[str] = None, model_operation: Optional[str] = None)[source]¶
Get service logs
- Parameters:
size (int) – size
checkpoint (dict) – the information from the last point checked in the service
start (str) – iso format time
end (str) – iso format time
follow (bool) – if true, keep stream future logs
text (str) – text
execution_id (str) – execution id
function_name (str) – function name
replica_id (str) – replica id
system (bool) – system
view (bool) – if true, print out all the logs
until_completed (bool) – wait until completed
model_id (str) – model id
model_operation (str) – model operation action
- Returns:
ServiceLog entity
- Return type:
Example:
service_log = service.log()
- rerun_batch(filters, wait=True)[source]¶
Rerun executions on an existing service
Prerequisites: You must be in the role of an owner or developer. You must have a Filter.
- Parameters:
filters – Filters entity for filtering before the rerun
wait (bool) – wait until the rerun finishes
- Returns:
rerun command
- Return type:
Example:
command = service.executions.rerun_batch( filters=dl.Filters(field='id', values=['executionId'], operator=dl.FiltersOperations.IN, resource=dl.FiltersResource.EXECUTION))
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type:
- class ServiceModeType(value)[source]¶
-
The type of the service mode.
State
Description
REGULAR
Service regular mode type
DEBUG
Service debug mode type
- class ServiceType(value)[source]¶
-
The type of the service (SYSTEM).
State
Description
SYSTEM
Dataloop internal service
Bot¶
- class Bot(created_at, updated_at, name, last_name, username, avatar, email, role, type, org, id, project, client_api=None, users=None, bots=None, password=None)[source]¶
Bases:
User
Bot entity
Trigger¶
- class BaseTrigger(id, url, created_at, updated_at, creator, name, active, type, scope, is_global, input, function_name, service_id, webhook_id, pipeline_id, special, project_id, spec, operation, service, project, client_api: ApiClient, op_type='service', repositories=_Nothing.NOTHING, updated_by=None)[source]¶
Bases:
BaseEntity
Trigger Entity
- classmethod from_json(_json, client_api, project, service=None)[source]¶
Build a trigger entity object from a json
- Parameters:
_json (dict) – platform json
client_api (dl.ApiClient) – ApiClient entity
project (dtlpy.entities.project.Project) – project entity
service (dtlpy.entities.service.Service) – service entity
- Returns:
- class CronTrigger(id, url, created_at, updated_at, creator, name, active, type, scope, is_global, input, function_name, service_id, webhook_id, pipeline_id, special, project_id, spec, operation, service, project, client_api: ApiClient, op_type='service', repositories=_Nothing.NOTHING, updated_by=None, start_at=None, end_at=None, cron=None)[source]¶
Bases:
BaseTrigger
- class Trigger(id, url, created_at, updated_at, creator, name, active, type, scope, is_global, input, function_name, service_id, webhook_id, pipeline_id, special, project_id, spec, operation, service, project, client_api: ApiClient, op_type='service', repositories=_Nothing.NOTHING, updated_by=None, filters=None, execution_mode=TriggerExecutionMode.ONCE, actions=TriggerAction.CREATED, resource=TriggerResource.ITEM)[source]¶
Bases:
BaseTrigger
Trigger Entity
- classmethod from_json(_json, client_api, project, service=None)[source]¶
Build a trigger entity object from a json
- Parameters:
_json – platform json
client_api – ApiClient entity
project (dtlpy.entities.project.Project) – project entity
service (dtlpy.entities.service.Service) – service entity
- Returns:
Execution¶
- class Execution(id, url, creator, created_at, updated_at, input, output, feedback_queue, status, status_log, sync_reply_to, latest_status, function_name, duration, attempts, max_attempts, to_terminate: bool, trigger_id, service_id, project_id, service_version, package_id, package_name, package_revision, client_api: ApiClient, service, project=None, repositories=_Nothing.NOTHING, pipeline: Optional[dict] = None, model: Optional[dict] = None)[source]¶
Bases:
BaseEntity
Service execution entity
- classmethod from_json(_json, client_api, project=None, service=None, is_fetched=True)[source]¶
- Parameters:
_json (dict) – platform json
client_api (dl.ApiClient) – ApiClient entity
project (dtlpy.entities.project.Project) – project entity
service (dtlpy.entities.service.Service) –
is_fetched – is Entity fetched from Platform
- logs(follow=False, log_level='DEBUG')[source]¶
Print logs for execution
- Parameters:
follow – keep stream future logs
log_level (str) – the log level to display
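Example (a minimal sketch, assuming an existing Execution object):
execution.logs(follow=True, log_level='INFO')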
- progress_update(status: Optional[ExecutionStatus] = None, percent_complete: Optional[int] = None, message: Optional[str] = None, output: Optional[str] = None, service_version: Optional[str] = None)[source]¶
Update Execution Progress
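Example (a minimal sketch; the status, percentage, and message are placeholders):
execution.progress_update(status=dl.ExecutionStatus.IN_PROGRESS, percent_complete=50, message='processing items')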
Model¶
- class Model(id, creator, created_at, updated_at, model_artifacts, name, description, ontology_id, labels, status, tags, configuration, metadata, input_type, output_type, module_name, url, scope, version, context, package_id, project_id, dataset_id, project, package, dataset, client_api: ApiClient, repositories=_Nothing.NOTHING, ontology=None, updated_by=None, app=None)[source]¶
Bases:
BaseEntity
Model object
- add_subset(subset_name: str, subset_filter: Filters)[source]¶
Adds a subset for the model, specifying a subset of the model’s dataset that could be used for training or validation.
- Parameters:
subset_name (str) – the name of the subset
subset_filter (dtlpy.entities.Filters) – the filtering operation that this subset performs in the dataset.
Example
model.add_subset(subset_name='train', subset_filter=dtlpy.Filters(field='dir', values='/train'))
model.metadata['system']['subsets']
{'train': <dtlpy.entities.filters.Filters object at 0x1501dfe20>}
- clone(model_name: str, dataset: Optional[Dataset] = None, configuration: Optional[dict] = None, status=None, scope=None, project_id: Optional[str] = None, labels: Optional[list] = None, description: Optional[str] = None, tags: Optional[list] = None, train_filter: Optional[Filters] = None, validation_filter: Optional[Filters] = None, wait=True)[source]¶
Clone and create a new model out of an existing one
- Parameters:
model_name (str) – str new model name
dataset (dtlpy.entities.dataset.Dataset) – dataset object for the cloned model
configuration (dict) – dict (optional) if passed replaces the current configuration
status (str) – str (optional) set the new status
scope (str) – str (optional) set the new scope. default is “project”
project_id (str) – str specify the project id to create the new model on (if other than the source model)
labels (list) – list of str - labels of the model
description (str) – str description of the new model
tags (list) – list of str - tags of the model
train_filter (dtlpy.entities.filters.Filters) – Filters entity or a dictionary to define the items’ scope in the specified dataset_id for the model train
validation_filter (dtlpy.entities.filters.Filters) – Filters entity or a dictionary to define the items’ scope in the specified dataset_id for the model validation
wait (bool) – bool wait for the model to be ready before returning
- Returns:
dl.Model which is a clone version of the existing model
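Example (a minimal sketch; the model name, dataset, and configuration are placeholders):
cloned_model = model.clone(model_name='my-model-clone', dataset=dataset, configuration={'epochs': 10})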
- delete_subset(subset_name: str)[source]¶
Removes a subset from the model’s metadata.
- Parameters:
subset_name (str) – the name of the subset
Example
model.add_subset(subset_name='train', subset_filter=dtlpy.Filters(field='dir', values='/train'))
model.metadata['system']['subsets']
{'train': <dtlpy.entities.filters.Filters object at 0x1501dfe20>}
model.delete_subset(subset_name='train')
model.metadata['system']['subsets']
{}
- deploy(service_config=None) Service [source]¶
Deploy a trained model. This will create a service that will execute predictions
- Parameters:
service_config (dict) – Service object as dict. Contains the spec of the default service to create.
- Returns:
dl.Service: The deployed service
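Example (a minimal sketch, assuming the model is already trained):
service = model.deploy()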
- evaluate(dataset_id, filters: Optional[Filters] = None, service_config=None)[source]¶
Evaluate the model. Provide the data to evaluate the model on; you can also provide a specific config for the deployed service.
- Parameters:
service_config (dict) – Service object as dict. Contains the spec of the default service to create.
dataset_id (str) – ID of the dataset to evaluate
filters (entities.Filters) – dl.Filters entity to run the predictions on
- Returns:
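Example (a minimal sketch; the dataset id and filter values are placeholders):
model.evaluate(dataset_id='dataset_id', filters=dl.Filters(field='dir', values='/test'))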
- classmethod from_json(_json, client_api, project, package, is_fetched=True)[source]¶
Turn platform representation of model into a model entity
- Parameters:
_json – platform representation of model
client_api – ApiClient entity
project – project that owns the model
package – package entity of the model
is_fetched – is Entity fetched from Platform
- Returns:
Model entity
- log(service=None, size=None, checkpoint=None, start=None, end=None, follow=False, text=None, execution_id=None, function_name=None, replica_id=None, system=False, view=True, until_completed=True, model_operation: Optional[str] = None)[source]¶
Get service logs
- Parameters:
service – service object
size (int) – size
checkpoint (dict) – the information from the last point checked in the service
start (str) – iso format time
end (str) – iso format time
follow (bool) – if true, keep stream future logs
text (str) – text
execution_id (str) – execution id
function_name (str) – function name
replica_id (str) – replica id
system (bool) – system
view (bool) – if true, print out all the logs
until_completed (bool) – wait until completed
model_operation (str) – model operation action
- Returns:
ServiceLog entity
- Return type:
Example:
service_log = model.log()
- predict(item_ids)[source]¶
Run model prediction with items
- Parameters:
item_ids – a list of item IDs to run the prediction on
- Returns:
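Example (a minimal sketch; the item ids are placeholders):
model.predict(item_ids=['item_id_1', 'item_id_2'])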
- train(service_config=None)[source]¶
Train the model in the cloud. This will create a service and will run the adapter’s train function as an execution
- Parameters:
service_config (dict) – Service object as dict. Contains the spec of the default service to create.
- Returns:
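Example (a minimal sketch; passing a service_config is optional):
model.train()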
Pipeline¶
- class Pipeline(id, name, creator, org_id, connections, settings: PipelineSettings, variables: List[Variable], status: CompositionStatus, created_at, updated_at, start_nodes, project_id, composition_id, url, preview, description, revisions, project, client_api: ApiClient, original_settings: PipelineSettings, original_variables: List[Variable], repositories=_Nothing.NOTHING, updated_by=None)[source]¶
Bases:
BaseEntity
Pipeline object
- execute(execution_input=None)[source]¶
Execute a pipeline and return the execution
- Parameters:
execution_input – list of the dl.FunctionIO or dict of pipeline input - example {‘item’: ‘item_id’}
- Returns:
entities.PipelineExecution object
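Example (a minimal sketch; the item id is a placeholder):
pipeline_execution = pipeline.execute(execution_input={'item': 'item_id'})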
- execute_batch(filters, execution_inputs=None, wait=True)[source]¶
Execute a pipeline in batch over filtered items and return the execution
- Parameters:
execution_inputs – list of the dl.FunctionIO or dict of pipeline input - example {‘item’: ‘item_id’}, that represent the extra inputs of the function
filters – Filters entity for a filtering before execute
wait (bool) – wait until create task finish
- Returns:
entities.PipelineExecution object
Example:
command = pipeline.execute_batch(execution_inputs=dl.FunctionIO(type=dl.PackageInputType.STRING, value='test', name='string'), filters=dl.Filters(field='dir', values='/test', context={'datasets': [dataset.id]}))
- classmethod from_json(_json, client_api, project, is_fetched=True)[source]¶
Turn platform representation of pipeline into a pipeline entity
- Parameters:
_json (dict) – platform representation of the pipeline
client_api (dl.ApiClient) – ApiClient entity
project (dtlpy.entities.project.Project) – project entity
is_fetched (bool) – is Entity fetched from Platform
- Returns:
Pipeline entity
- Return type:
- install(resume_option: Optional[PipelineResumeOption] = None)[source]¶
install pipeline
- Returns:
Composition entity
- pause(keep_triggers_active: Optional[bool] = None)[source]¶
pause pipeline
- Returns:
Composition entity
- reset(stop_if_running: bool = False)[source]¶
Resets pipeline counters
- Parameters:
stop_if_running (bool) – If the pipeline is installed it will stop the pipeline and reset the counters.
- Returns:
bool
- set_start_node(node: PipelineNode)[source]¶
Set the start node of the pipeline
- Parameters:
node (PipelineNode) – node to be the start node
- stats()[source]¶
Get pipeline counters
- Returns:
PipelineStats
- Return type:
dtlpy.entities.pipeline.PipelineStats
- class Variable(_dict=None, **kwargs)[source]¶
Bases:
DlEntity
Pipeline Variables
Pipeline Execution¶
- class PipelineExecution(id, nodes, executions, status, created_at, updated_at, pipeline_id, max_attempts, creator, pipeline, project, client_api: ApiClient, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Pipeline execution object
- classmethod from_json(_json, client_api, pipeline, is_fetched=True) PipelineExecution [source]¶
Turn platform representation of pipeline_execution into a pipeline_execution entity
- Parameters:
_json (dict) – platform representation of the pipeline execution
client_api (dl.ApiClient) – ApiClient entity
pipeline (dtlpy.entities.pipeline.Pipeline) – Pipeline entity
is_fetched (bool) – is Entity fetched from Platform
- Returns:
PipelineExecution entity
- Return type:
dtlpy.entities.PipelineExecution
- rerun(method: Optional[str] = None, start_nodes_ids: Optional[list] = None, wait: bool = True) bool [source]¶
Rerun the pipeline execution
Prerequisites: You must be an owner or developer to use this method.
- Parameters:
- Returns:
True if success
- Return type:
Example:
pipeline_execution.rerun(method=dl.CycleRerunMethod.START_FROM_BEGINNING)
Other¶
Pages¶
- class PagedEntities(client_api: ApiClient, page_offset, page_size, filters, items_repository, has_next_page=False, total_pages_count=0, items_count=0, service_id=None, project_id=None, order_by_type=None, order_by_direction=None, execution_status=None, execution_resource_type=None, execution_resource_id=None, execution_function_name=None, list_function=None, items=[])[source]¶
Bases:
object
Pages object
- get_page(page_offset=None, page_size=None)[source]¶
Get page
- Parameters:
page_offset – page offset
page_size – page size
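Example (a minimal sketch, assuming 'pages' is a PagedEntities object returned by a list call such as dataset.items.list()):
pages.get_page(page_offset=0, page_size=100)
for page in pages:
    for item in page:
        print(item.name)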
Base Entity¶
Command¶
- class Command(id, url, status, created_at, updated_at, type, progress, spec, error, client_api: ApiClient, repositories=_Nothing.NOTHING)[source]¶
Bases:
BaseEntity
Command entity
- classmethod from_json(_json, client_api, is_fetched=True)[source]¶
Build a Command entity object from a json
- Parameters:
_json – _json response from host
client_api – ApiClient entity
is_fetched – is Entity fetched from Platform
- Returns:
Command object
- in_progress()[source]¶
Check if command is still in one of the in progress statuses
- Returns:
True if command still in progress
- Return type:
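Example (a minimal sketch, assuming an existing Command object):
if command.in_progress():
    print('command {} is still running: {}%'.format(command.id, command.progress))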
- to_json()[source]¶
Returns platform _json format of object
- Returns:
platform json format of object
- Return type: