aiida.orm package#
Main module exposing all ORM classes and methods
Subpackages#
Submodules#
Module for the AuthInfo ORM class.
- class aiida.orm.authinfos.AuthInfo(computer: Computer, user: User, backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Entity[BackendAuthInfo]
ORM class that models the authorization information that allows a User to connect to a Computer.
- PROPERTY_WORKDIR = 'workdir'#
- _CLS_COLLECTION#
- __abstractmethods__ = frozenset({})#
- __init__(computer: Computer, user: User, backend: Optional[StorageBackend] = None) None [source]#
Create an AuthInfo instance for the given computer and user.
- Parameters
computer – a Computer instance
user – a User instance
backend – the backend to use for the instance, or use the default backend if None
- __module__ = 'aiida.orm.authinfos'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendAuthInfo')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- property enabled: bool#
Return whether this instance is enabled.
- Returns
True if enabled, False otherwise
- get_auth_params() Dict[str, Any] [source]#
Return the dictionary of authentication parameters
- Returns
a dictionary with authentication parameters
- get_metadata() Dict[str, Any] [source]#
Return the dictionary of metadata
- Returns
a dictionary with metadata
- get_transport() Transport [source]#
Return a fully configured transport that can be used to connect to the computer set for this instance.
- get_workdir() str [source]#
Return the working directory.
If no explicit work directory is set for this instance, the working directory of the computer will be returned.
- Returns
the working directory
- set_auth_params(auth_params: Dict[str, Any]) None [source]#
Set the dictionary of authentication parameters
- Parameters
auth_params – a dictionary with authentication parameters
- class aiida.orm.authinfos.AuthInfoCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Collection[AuthInfo]
The collection of AuthInfo entries.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.authinfos'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('AuthInfo')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.authinfos.AuthInfo] [source]#
The allowed entity class or subclasses thereof.
Module to manage the autogrouping functionality by verdi run.
- class aiida.orm.autogroup.AutogroupManager(backend)[source]#
Bases: object
Class to automatically add all newly stored Node instances to an AutoGroup (whilst enabled).
This class should not be instantiated directly, but rather accessed through the backend storage instance.
The auto-grouping is checked by the Node.store() method which, if is_to_be_grouped is true, will store the node in the associated AutoGroup.
The exclude/include lists are lists of strings like aiida.data:core.int, aiida.calculation:quantumespresso.pw, aiida.data:core.array.%, i.e. a string identifying the base class, followed by a colon and the path to the class as accepted by CalculationFactory/DataFactory. Each string can contain one or more wildcard characters %; in this case the string is used in a like comparison with the QueryBuilder. Note that in this case you have to remember that _ means "any single character" in the QueryBuilder, and you need to escape it if you mean a literal underscore.
Only one of the two lists (exclude or include) can be set. If neither is set, everything is included.
- __module__ = 'aiida.orm.autogroup'#
- __weakref__#
list of weak references to the object (if defined)
- static _matches(string, filter_string)[source]#
Check if ‘string’ matches the ‘filter_string’ (used for include and exclude filters).
If ‘filter_string’ does not contain any % sign, perform an exact match. Otherwise, match with a SQL-like query, where % means any character sequence, and _ means a single character (these characters can be escaped with a backslash).
- Parameters
string – the string to match.
filter_string – the filter string.
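The documented matching semantics can be sketched in plain Python. This is an illustrative reimplementation of the described behaviour, not AiiDA's internal code; the helper name like_to_regex is invented here:

```python
import re

def like_to_regex(filter_string: str) -> "re.Pattern[str]":
    """Translate a SQL ``like`` filter into a compiled, fully anchored regex.

    ``%`` matches any character sequence, ``_`` a single character;
    both can be escaped with a backslash to match literally.
    """
    pieces = []
    i = 0
    while i < len(filter_string):
        char = filter_string[i]
        if char == '\\' and i + 1 < len(filter_string):
            pieces.append(re.escape(filter_string[i + 1]))  # escaped literal
            i += 2
            continue
        if char == '%':
            pieces.append('.*')
        elif char == '_':
            pieces.append('.')
        else:
            pieces.append(re.escape(char))
        i += 1
    return re.compile(''.join(pieces) + '$')

def matches(string: str, filter_string: str) -> bool:
    # Exact match when no wildcard is present, as the docstring describes.
    if '%' not in filter_string and '_' not in filter_string:
        return string == filter_string
    return like_to_regex(filter_string).match(string) is not None
```

For example, `matches('aiida.data:core.array.float', 'aiida.data:core.array.%')` is true, while an escaped wildcard such as `'a\\_b'` only matches the literal string `'a_b'`.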
- get_exclude() Optional[List[str]] [source]#
Return the list of classes to exclude from autogrouping.
- Returns
None if no exclusion list has been set.
- get_group_label_prefix() str [source]#
Get the prefix of the label of the group. If no group label prefix was set, it will set a default one by itself.
- get_include() Optional[List[str]] [source]#
Return the list of classes to include in the autogrouping.
- Returns
None if no inclusion list has been set.
- get_or_create_group() aiida.orm.groups.AutoGroup [source]#
Return the current AutoGroup, or create one if None has been set yet.
This function implements a somewhat complex logic that is however needed to make sure that, even if verdi run is invoked multiple times simultaneously, e.g. in a bash for loop, there is never the risk that two verdi run Unix processes try to create the same group with the same label, ending up in a crash of the code (see PR #3650).
Here, instead, we make sure that if this concurrency issue happens, one of the two will get an IntegrityError from the DB, and then recover by trying to create a group with a different label (with a numeric suffix appended), until it manages to create it.
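The recovery strategy described above can be sketched in plain Python. This is an illustrative model, not AiiDA's internal code: IntegrityError here is a stand-in exception and an in-memory label set plays the role of the database uniqueness constraint.

```python
class IntegrityError(Exception):
    """Stand-in for the database's uniqueness-violation error."""

# Labels already taken, e.g. by a concurrent verdi run process.
existing_labels = {'Verdi autogroup on 2024-01-01'}

def create_group(label: str) -> str:
    """Pretend to INSERT a group; fail if the label is already taken."""
    if label in existing_labels:
        raise IntegrityError(label)
    existing_labels.add(label)
    return label

def get_or_create_group(label_prefix: str) -> str:
    """Try the bare prefix first, then append increasing numeric suffixes
    until creation succeeds, mirroring the documented recovery logic."""
    counter = 0
    label = label_prefix
    while True:
        try:
            return create_group(label)
        except IntegrityError:
            counter += 1
            label = f'{label_prefix}_{counter}'
```

On a clash only one process keeps the original label; the other retries with `_1`, `_2`, ... until the database accepts the insert.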
- is_to_be_grouped(node) bool [source]#
Return whether the given node is to be auto-grouped according to enable state and include/exclude lists.
- set_exclude(exclude: Optional[List[str]]) None [source]#
Set the list of classes to exclude from the autogrouping.
- Parameters
exclude – a list of valid entry point strings (which may contain '%' to be matched using SQL's LIKE pattern-matching logic), or None to specify no exclude list.
- set_group_label_prefix(label_prefix: Optional[str]) None [source]#
Set the label of the group to be created (or use a default).
Comment objects and functions
- class aiida.orm.comments.Comment(node: Node, user: User, content: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Entity[BackendComment]
Base class to map a DbComment that represents a comment attached to a certain Node.
- _CLS_COLLECTION#
alias of aiida.orm.comments.CommentCollection
- __abstractmethods__ = frozenset({})#
- __init__(node: Node, user: User, content: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Create a Comment for a given node and user
- Parameters
node – a Node instance
user – a User instance
content – the comment content
backend – the backend to use for the instance, or use the default backend if None
- Returns
a Comment object associated to the given node and user
- __module__ = 'aiida.orm.comments'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendComment')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- property ctime: datetime.datetime#
- property mtime: datetime.datetime#
- set_mtime(value: datetime.datetime) None [source]#
- class aiida.orm.comments.CommentCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Collection[Comment]
The collection of Comment entries.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.comments'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('Comment')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.comments.Comment] [source]#
The allowed entity class or subclasses thereof.
- delete(pk: int) None [source]#
Remove a Comment from the collection with the given id
- Parameters
pk – the id of the comment to delete
- Raises
TypeError – if pk is not an int
NotExistent – if the Comment with ID pk is not found
- delete_all() None [source]#
Delete all Comments from the Collection
- Raises
IntegrityError – if all Comments could not be deleted
Module for Computer entities
- class aiida.orm.computers.Computer(label: str = None, hostname: str = '', description: str = '', transport_type: str = '', scheduler_type: str = '', workdir: str = None, backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Entity[BackendComputer]
Computer entity.
- PROPERTY_MINIMUM_SCHEDULER_POLL_INTERVAL = 'minimum_scheduler_poll_interval'#
- PROPERTY_MINIMUM_SCHEDULER_POLL_INTERVAL__DEFAULT = 10.0#
- PROPERTY_SHEBANG = 'shebang'#
- PROPERTY_WORKDIR = 'workdir'#
- _CLS_COLLECTION#
- __abstractmethods__ = frozenset({})#
- __init__(label: str = None, hostname: str = '', description: str = '', transport_type: str = '', scheduler_type: str = '', workdir: str = None, backend: Optional[StorageBackend] = None) None [source]#
Construct a new computer.
- __module__ = 'aiida.orm.computers'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendComputer')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- classmethod _append_text_validator(append_text: str) None [source]#
Validates the append text string.
- classmethod _default_mpiprocs_per_machine_validator(def_cpus_per_machine: Optional[int]) None [source]#
Validates the default number of CPUs per machine (node)
- _logger = <Logger aiida.orm.computers (REPORT)>#
- _mpirun_command_validator(mpirun_cmd: Union[List[str], Tuple[str, ...]]) None [source]#
Validates the mpirun_command variable. MUST be called after properly checking for a valid scheduler.
- classmethod _prepend_text_validator(prepend_text: str) None [source]#
Validates the prepend text string.
- classmethod _scheduler_type_validator(scheduler_type: str) None [source]#
Validates the scheduler type string.
- classmethod _transport_type_validator(transport_type: str) None [source]#
Validates the transport string.
- configure(user: Optional[User] = None, **kwargs: Any) AuthInfo [source]#
Configure a computer for a user with valid auth params passed via kwargs
- Parameters
user – the user to configure the computer for
- Kwargs
the configuration keywords with corresponding values
- Returns
the authinfo object for the configured user
- copy() aiida.orm.computers.Computer [source]#
Return a copy of the current object to work with, not stored yet.
- classmethod default_memory_per_machine_validator(def_memory_per_machine: Optional[int]) None [source]#
Validates the default amount of memory (kB) per machine (node)
- delete_property(name: str, raise_exception: bool = True) None [source]#
Delete a property from this computer
- Parameters
name – the name of the property
raise_exception – if True raise if the property does not exist, otherwise return None
- get_authinfo(user: User) AuthInfo [source]#
Return the aiida.orm.authinfos.AuthInfo instance for the given user on this computer, if the computer is configured for the given user.
- Parameters
user – a User instance.
- Returns
an AuthInfo instance
- Raises
aiida.common.NotExistent – if the computer is not configured for the given user.
- get_configuration(user: Optional[User] = None) Dict[str, Any] [source]#
Get the configuration of computer for the given user as a dictionary
- Parameters
user – the user to get the configuration for; otherwise the default user
- get_default_memory_per_machine() Optional[int] [source]#
Return the default amount of memory (kB) per machine (node) for this computer, or None if it was not set.
- get_default_mpiprocs_per_machine() Optional[int] [source]#
Return the default number of CPUs per machine (node) for this computer, or None if it was not set.
- get_minimum_job_poll_interval() float [source]#
Get the minimum interval between subsequent requests to poll the scheduler for job status.
Note
If no value was ever set for this computer, it will fall back on the default provided by the associated transport class in the DEFAULT_MINIMUM_JOB_POLL_INTERVAL attribute. If the computer doesn't have a transport class, or it cannot be loaded, or it doesn't provide a job poll interval default, then this will fall back on the PROPERTY_MINIMUM_SCHEDULER_POLL_INTERVAL__DEFAULT attribute of this class.
- Returns
The minimum interval (in seconds).
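The fallback chain in the note above can be summarised with a small sketch. The constant names mirror the attributes mentioned in the note, but the function itself is illustrative, not AiiDA's implementation:

```python
PROPERTY_MINIMUM_SCHEDULER_POLL_INTERVAL__DEFAULT = 10.0  # class-level fallback

def minimum_job_poll_interval(instance_value, transport_class) -> float:
    """Instance property first, then the transport class default, then the class constant."""
    if instance_value is not None:
        return instance_value
    # Missing transport class, or one without a default, falls through.
    default = getattr(transport_class, 'DEFAULT_MINIMUM_JOB_POLL_INTERVAL', None)
    if default is not None:
        return default
    return PROPERTY_MINIMUM_SCHEDULER_POLL_INTERVAL__DEFAULT
```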
- get_mpirun_command() List[str] [source]#
Return the mpirun command. Must be a list of strings that will be joined with spaces when submitting.
A sensible default that may be adequate in many cases is also provided.
- get_property(name: str, *args: Any) Any [source]#
Get a property of this computer
- Parameters
name – the property name
args – additional arguments
- Returns
the property value
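The (name, *args) signature suggests the common idiom where a single extra positional argument acts as an optional default. The following sketches that idiom over a plain dict; it is not AiiDA's exact implementation:

```python
def get_property(properties: dict, name: str, *args):
    """Look up ``name``; one extra positional argument acts as a default.

    With no default, a missing property raises; with one, it is returned instead.
    """
    if len(args) > 1:
        raise TypeError(f'expected at most one default, got {len(args)}')
    try:
        return properties[name]
    except KeyError:
        if not args:
            raise
        return args[0]
```

Using *args rather than `default=None` distinguishes "no default given" from "default is None".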
- get_transport(user: Optional[User] = None) Transport [source]#
Return a Transport class, configured with all correct parameters. The Transport is closed, meaning that if you want to run any operation with it, you have to open it first (e.g. for an SSH transport, you have to open a connection). To do this you can call transport.open(), or simply run within a with statement:

    transport = computer.get_transport()
    with transport:
        print(transport.whoami())

- Parameters
user – if None, try to obtain a transport for the default user. Otherwise, pass a valid User.
- Returns
a (closed) Transport, already configured with the connection parameters to the supercomputer, as configured with verdi computer configure for the user specified as the user parameter.
- get_transport_class() Type[Transport] [source]#
Get the transport class for this computer. Can be used to instantiate a transport instance.
- get_use_double_quotes() bool [source]#
Return whether the command line parameters of this computer should be escaped with double quotes.
- Returns
True if to escape with double quotes, False otherwise which is also the default.
- get_workdir() str [source]#
Get the working directory for this computer.
- Returns
The currently configured working directory
- is_user_configured(user: User) bool [source]#
Is the user configured on this computer?
- Parameters
user – the user to check
- Returns
True if configured, False otherwise
- is_user_enabled(user: User) bool [source]#
Is the given user enabled to run on this computer?
- Parameters
user – the user to check
- Returns
True if enabled, False otherwise
- property logger: logging.Logger#
- set_default_memory_per_machine(def_memory_per_machine: Optional[int]) None [source]#
Set the default amount of memory (kB) per machine (node) for this computer. Accepts None if you do not want to set this value.
- set_default_mpiprocs_per_machine(def_cpus_per_machine: Optional[int]) None [source]#
Set the default number of CPUs per machine (node) for this computer. Accepts None if you do not want to set this value.
- set_minimum_job_poll_interval(interval: float) None [source]#
Set the minimum interval between subsequent requests to update the list of jobs currently running on this computer.
- Parameters
interval – The minimum interval in seconds
- set_mpirun_command(val: Union[List[str], Tuple[str, ...]]) None [source]#
Set the mpirun command. It must be a list of strings (you can use string.split() if you have a single, space-separated string).
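Following the docstring's suggestion, a single space-separated string can be converted into the required list with str.split(). The placeholder shown is illustrative of a scheduler substitution variable:

```python
# Split a space-separated command string into the list form expected by set_mpirun_command.
mpirun_command = 'mpirun -np {tot_num_mpiprocs}'.split()
print(mpirun_command)  # ['mpirun', '-np', '{tot_num_mpiprocs}']
```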
- set_property(name: str, value: Any) None [source]#
Set a property on this computer
- Parameters
name – the property name
value – the new value
- set_use_double_quotes(val: bool) None [source]#
Set whether the command line parameters of this computer should be escaped with double quotes.
- Parameters
val – True to escape with double quotes, False otherwise.
- store() aiida.orm.computers.Computer [source]#
Store the computer in the DB.
Unlike Nodes, a computer can be re-stored if its properties are to be changed (e.g. a new mpirun command, etc.)
- property uuid: str#
Return the UUID for this computer.
This identifier is unique across all entities types and backend instances.
- Returns
the entity uuid
- validate() None [source]#
Check if the attributes and files retrieved from the DB are valid. Raise a ValidationError if something is wrong.
Must be able to work even before storing: therefore, use the get_attr and similar methods that automatically read either from the DB or from the internal attribute cache.
For the base class, this is always valid. Subclasses will reimplement this. In the subclass, always call the super().validate() method first!
- class aiida.orm.computers.ComputerCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases: aiida.orm.entities.Collection[Computer]
The collection of Computer entries.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.computers'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('Computer')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.computers.Computer] [source]#
The allowed entity class or subclasses thereof.
- get_or_create(label: Optional[str] = None, **kwargs) Tuple[bool, aiida.orm.computers.Computer] [source]#
Try to retrieve a Computer from the DB with the given arguments; create (and store) a new Computer if such a Computer was not present yet.
- Parameters
label – computer label
- Returns
(created, computer), where created is a boolean indicating whether the computer was just created, and computer is the computer (new or existing, in any case already stored)
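The get-or-create pattern can be sketched with a plain dict standing in for the database collection; this is illustrative only, not AiiDA's implementation:

```python
def get_or_create(computers: dict, label: str, **kwargs):
    """Return (created, computer): fetch by label, or create and 'store' on a miss."""
    try:
        return False, computers[label]
    except KeyError:
        computer = {'label': label, **kwargs}  # stand-in for Computer(label=..., **kwargs)
        computers[label] = computer            # stand-in for computer.store()
        return True, computer
```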
Module for converting backend entities into frontend ORM entities
- class aiida.orm.convert.ConvertIterator(backend_iterator)[source]#
Bases: collections.abc.Iterator, collections.abc.Sized
Iterator that converts backend entities into frontend ORM entities as needed.
See aiida.orm.Group.nodes() for an example.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.convert'#
- __weakref__#
list of weak references to the object (if defined)
- _abc_impl = <_abc_data object>#
- aiida.orm.convert.get_orm_entity(backend_entity)[source]#
- aiida.orm.convert.get_orm_entity(backend_entity: collections.abc.Mapping)
- aiida.orm.convert.get_orm_entity(backend_entity: tuple)
- aiida.orm.convert.get_orm_entity(backend_entity: list)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.groups.BackendGroup)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.computers.BackendComputer)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.users.BackendUser)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.authinfos.BackendAuthInfo)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.logs.BackendLog)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.comments.BackendComment)
- aiida.orm.convert.get_orm_entity(backend_entity: aiida.orm.implementation.nodes.BackendNode)
Module for all common top level AiiDA entity classes and methods
- class aiida.orm.entities.Collection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases: abc.ABC, Generic[aiida.orm.entities.EntityType]
Container class that represents the collection of objects of a particular entity type.
- __abstractmethods__ = frozenset({'_entity_base_cls'})#
- __call__(backend: StorageBackend) aiida.orm.entities.CollectionType [source]#
Get or create a cached collection using a new backend.
- __init__(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None) None [source]#
Construct a new entity collection.
- Parameters
entity_class – the entity type e.g. User, Computer, etc
backend – the backend instance to get the collection for, or use the default
- __module__ = 'aiida.orm.entities'#
- __orig_bases__ = (<class 'abc.ABC'>, typing.Generic[~EntityType])#
- __parameters__ = (~EntityType,)#
- __weakref__#
list of weak references to the object (if defined)
- _abc_impl = <_abc_data object>#
- abstract static _entity_base_cls() Type[aiida.orm.entities.EntityType] [source]#
The allowed entity class or subclasses thereof.
- all() List[aiida.orm.entities.EntityType] [source]#
Get all entities in this collection.
- Returns
A list of all entities
- property backend: StorageBackend#
Return the backend.
- count(filters: Optional[FilterType] = None) int [source]#
Count entities in this collection according to criteria.
- Parameters
filters – the keyword value pair filters to match
- Returns
The number of entities found using the supplied criteria
- property entity_type: Type[aiida.orm.entities.EntityType]#
The entity type for this instance.
- find(filters: Optional[FilterType] = None, order_by: Optional[OrderByType] = None, limit: Optional[int] = None) List[aiida.orm.entities.EntityType] [source]#
Find collection entries matching the filter criteria.
- Parameters
filters – the keyword value pair filters to match
order_by – a list of (key, direction) pairs specifying the sort order
limit – the maximum number of results to return
- Returns
a list of resulting matches
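The filter/order/limit semantics can be illustrated with an in-memory sketch. This toy version supports only exact-match filters; the real method delegates to the QueryBuilder, which supports richer operators:

```python
from operator import itemgetter

def find(entries, filters=None, order_by=None, limit=None):
    """Keep entries matching all key/value filters, apply (key, direction)
    sort pairs (stable sort, so later pairs are applied first), then cap the count."""
    results = [entry for entry in entries
               if all(entry.get(key) == value for key, value in (filters or {}).items())]
    for key, direction in reversed(order_by or []):
        results.sort(key=itemgetter(key), reverse=(direction == 'desc'))
    return results if limit is None else results[:limit]
```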
- get(**filters: Any) aiida.orm.entities.EntityType [source]#
Get a single collection entry that matches the filter criteria.
- Parameters
filters – the filters identifying the object to get
- Returns
the entry
- classmethod get_cached(entity_class: Type[aiida.orm.entities.EntityType], backend: StorageBackend)[source]#
Get the cached collection instance for the given entity class and backend.
- Parameters
backend – the backend instance to get the collection for
- query(filters: Optional[FilterType] = None, order_by: Optional[OrderByType] = None, limit: Optional[int] = None, offset: Optional[int] = None) QueryBuilder [source]#
Get a query builder for the objects of this collection.
- Parameters
filters – the keyword value pair filters to match
order_by – a list of (key, direction) pairs specifying the sort order
limit – the maximum number of results to return
offset – number of initial results to be skipped
- class aiida.orm.entities.Entity(backend_entity: aiida.orm.entities.BackendEntityType)[source]#
Bases: abc.ABC, Generic[aiida.orm.entities.BackendEntityType]
An AiiDA entity
- _CLS_COLLECTION#
alias of aiida.orm.entities.Collection
- __abstractmethods__ = frozenset({})#
- __init__(backend_entity: aiida.orm.entities.BackendEntityType) None [source]#
- Parameters
backend_entity – the backend model supporting this entity
- __module__ = 'aiida.orm.entities'#
- __orig_bases__ = (<class 'abc.ABC'>, typing.Generic[~BackendEntityType])#
- __parameters__ = (~BackendEntityType,)#
- __weakref__#
list of weak references to the object (if defined)
- _abc_impl = <_abc_data object>#
- property backend: StorageBackend#
Get the backend for this entity
- property backend_entity: aiida.orm.entities.BackendEntityType#
Get the implementing class for this object
- collection#
A class that, when used as a decorator, works as if the two decorators @property and @classmethod were applied together (i.e., the object works as a property, both for the class and for any of its instances, and is called with the class cls rather than with the instance as its first argument).
- classmethod from_backend_entity(backend_entity: aiida.orm.entities.BackendEntityType) aiida.orm.entities.EntityType [source]#
Construct an entity from a backend entity instance
- Parameters
backend_entity – the backend entity
- Returns
an AiiDA entity instance
- property id: int#
Return the id for this entity.
This identifier is guaranteed to be unique amongst entities of the same type for a single backend instance.
- Returns
the entity’s id
- objects#
A class that, when used as a decorator, works as if the two decorators @property and @classmethod were applied together (i.e., the object works as a property, both for the class and for any of its instances, and is called with the class cls rather than with the instance as its first argument).
- class aiida.orm.entities.EntityTypes(value)[source]#
Bases: enum.Enum
Enum for referring to ORM entities in a backend-agnostic manner.
- AUTHINFO = 'authinfo'#
- COMMENT = 'comment'#
- COMPUTER = 'computer'#
- GROUP = 'group'#
- GROUP_NODE = 'group_node'#
- LINK = 'link'#
- LOG = 'log'#
- NODE = 'node'#
- USER = 'user'#
- __module__ = 'aiida.orm.entities'#
Interface to the extras of a node instance.
- class aiida.orm.extras.EntityExtras(entity: Union[Node, Group])[source]#
Bases: object
Interface to the extras of a node or group instance.
Extras are a JSONable dictionary, stored on each entity, allowing for arbitrary data to be stored by users.
Extras are mutable, even after storing the entity, and as such are not deemed a core part of the provenance graph.
- __module__ = 'aiida.orm.extras'#
- __weakref__#
list of weak references to the object (if defined)
- property all: Dict[str, Any]#
Return the complete extras dictionary.
Warning
While the entity is unstored, this will return references of the extras on the database model, meaning that changes on the returned values (if they are mutable themselves, e.g. a list or dictionary) will automatically be reflected on the database model as well. As soon as the entity is stored, the returned extras will be a deep copy and mutations of the database extras will have to go through the appropriate set methods. Therefore, once stored, retrieving a deep copy can be a heavy operation. If you only need the keys or some values, use the iterators keys and items, or the getters get and get_many instead.
- Returns
the extras as a dictionary
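The reference-versus-deep-copy behaviour described in the warning can be demonstrated with a toy model; ExtrasHolder is invented here purely for illustration:

```python
from copy import deepcopy

class ExtrasHolder:
    """Toy model: hand out live references while 'unstored', deep copies once 'stored'."""

    def __init__(self):
        self._extras = {}
        self.is_stored = False

    @property
    def all(self):
        if self.is_stored:
            return deepcopy(self._extras)  # mutations no longer reach the model
        return self._extras                # live reference: mutations are reflected
```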
- delete(key: str) None [source]#
Delete an extra.
- Parameters
key – name of the extra
- Raises
AttributeError – if the extra does not exist
- delete_many(keys: List[str]) None [source]#
Delete multiple extras.
- Parameters
keys – names of the extras to delete
- Raises
AttributeError – if at least one of the extras does not exist
- get(key: str, default: Any = ()) Any [source]#
Return the value of an extra.
Warning
While the entity is unstored, this will return a reference of the extra on the database model, meaning that changes on the returned value (if they are mutable themselves, e.g. a list or dictionary) will automatically be reflected on the database model as well. As soon as the entity is stored, the returned extra will be a deep copy and mutations of the database extras will have to go through the appropriate set methods.
- Parameters
key – name of the extra
default – return this value instead of raising if the attribute does not exist
- Returns
the value of the extra
- Raises
AttributeError – if the extra does not exist and no default is specified
- get_many(keys: List[str]) List[Any] [source]#
Return the values of multiple extras.
Warning
While the entity is unstored, this will return references of the extras on the database model, meaning that changes on the returned values (if they are mutable themselves, e.g. a list or dictionary) will automatically be reflected on the database model as well. As soon as the entity is stored, the returned extras will be a deep copy and mutations of the database extras will have to go through the appropriate set methods. Therefore, once stored, retrieving a deep copy can be a heavy operation. If you only need the keys or some values, use the iterators extras_keys and extras_items, or the getters get_extra and get_extra_many instead.
- Parameters
keys – a list of extra names
- Returns
a list of extra values
- Raises
AttributeError – if at least one extra does not exist
- items() Iterator[Tuple[str, Any]] [source]#
Return an iterator over the extras.
- Returns
an iterator with extra key value pairs
- keys() Iterable[str] [source]#
Return an iterator over the extra keys.
- Returns
an iterator with extra keys
- reset(extras: Dict[str, Any]) None [source]#
Reset the extras.
Note
This will completely clear any existing extras and replace them with the new dictionary.
- Parameters
extras – a dictionary with the extras to set
- Raises
aiida.common.ValidationError – if any of the keys are invalid, i.e. contain periods
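The stored versus unstored behaviour described in the warnings above can be sketched with a plain-Python toy model (a hypothetical simplification, not the actual backend implementation): while the entity is unstored, `all` hands back the live dictionary, so mutations leak through; once stored, a deep copy is returned and mutations on it are lost.

```python
import copy

class ExtrasSketch:
    """Toy model of EntityExtras reference vs. deep-copy semantics (illustrative only)."""

    def __init__(self):
        self._extras = {}
        self._stored = False

    def store(self):
        self._stored = True

    @property
    def all(self):
        # Unstored: return the live dictionary (mutations are reflected on the model).
        # Stored: return a deep copy (mutations do not touch the model).
        return copy.deepcopy(self._extras) if self._stored else self._extras

    def set(self, key, value):
        self._extras[key] = value

e = ExtrasSketch()
e.set('kinds', [1, 2])
e.all['kinds'].append(3)   # unstored: mutates the underlying model
assert e.all['kinds'] == [1, 2, 3]

e.store()
e.all['kinds'].append(4)   # stored: mutates only a throwaway deep copy
assert e.all['kinds'] == [1, 2, 3]
```

This is also why, once stored, retrieving `all` repeatedly can be heavy: each access produces a fresh deep copy.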
AiiDA Group entities
- class aiida.orm.groups.AutoGroup(label: Optional[str] = None, user: Optional[User] = None, description: str = '', type_string: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.groups.Group
Group used to contain the selected nodes generated while autogrouping is enabled.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.groups'#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- class aiida.orm.groups.Group(label: Optional[str] = None, user: Optional[User] = None, description: str = '', type_string: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Entity
[BackendGroup
]An AiiDA ORM implementation of group of nodes.
- _CLS_COLLECTION#
alias of
aiida.orm.groups.GroupCollection
- __abstractmethods__ = frozenset({})#
- __annotations__ = {'_type_string': typing.ClassVar[typing.Union[str, NoneType]]}#
- __getattr__(name: str) Any [source]#
This method is called when an attribute is not found on the instance.
It allows for the handling of deprecated mixin methods.
- __init__(label: Optional[str] = None, user: Optional[User] = None, description: str = '', type_string: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Create a new group. Either pass a dbgroup parameter to reload a group from the DB (in which case no further parameters are allowed), or pass the parameters for the group creation.
- Parameters
label – The group label, required on creation
description – The group description (by default, an empty string)
user – The owner of the group (by default, the automatic user)
type_string – a string identifying the type of group (by default, an empty string, indicating a user-defined group)
- __module__ = 'aiida.orm.groups'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendGroup')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- _deprecated_extra_methods = {'clear_extras': 'clear', 'delete_extra': 'delete', 'delete_extra_many': 'delete_many', 'extras': 'all', 'extras_items': 'items', 'extras_keys': 'keys', 'get_extra': 'get', 'get_extra_many': 'get_many', 'reset_extras': 'reset', 'set_extra': 'set', 'set_extra_many': 'set_many'}#
- add_nodes(nodes: Union[Node, Sequence[Node]]) None [source]#
Add a node or a set of nodes to the group.
- Note
all the nodes and the group itself have to be stored.
- Parameters
nodes – a single Node or a list of Nodes
- property base: aiida.orm.groups.GroupBase#
Return the group base namespace.
- count() int [source]#
Return the number of entities in this group.
- Returns
integer number of entities contained within the group
- property is_empty: bool#
Return whether the group is empty, i.e. it does not contain any nodes.
- Returns
True if it contains no nodes, False otherwise
- property nodes: aiida.orm.convert.ConvertIterator#
Return a generator/iterator that iterates over all nodes and returns the respective AiiDA subclasses of Node, and that also allows asking for the number of nodes in the group using len().
- remove_nodes(nodes: Union[Node, Sequence[Node]]) None [source]#
Remove a node or a set of nodes from the group.
- Note
all the nodes and the group itself have to be stored.
- Parameters
nodes – a single Node or a list of Nodes
- class aiida.orm.groups.GroupBase(group: aiida.orm.groups.Group)[source]#
Bases:
object
A namespace for group related functionality, that is not directly related to its user-facing properties.
- __dict__ = mappingproxy({'__module__': 'aiida.orm.groups', '__doc__': 'A namespace for group related functionality, that is not directly related to its user-facing properties.', '__init__': <function GroupBase.__init__>, 'extras': <functools.cached_property object>, '__dict__': <attribute '__dict__' of 'GroupBase' objects>, '__weakref__': <attribute '__weakref__' of 'GroupBase' objects>, '__annotations__': {'_group': "'Group'"}})#
- __init__(group: aiida.orm.groups.Group) None [source]#
Construct a new instance of the base namespace.
- __module__ = 'aiida.orm.groups'#
- __weakref__#
list of weak references to the object (if defined)
- property extras: aiida.orm.extras.EntityExtras#
Return the extras of this group.
- class aiida.orm.groups.GroupCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Collection
[Group
]Collection of Groups
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.groups'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('Group')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.groups.Group] [source]#
The allowed entity class or subclasses thereof.
- get_or_create(label: Optional[str] = None, **kwargs) Tuple[aiida.orm.groups.Group, bool] [source]#
Try to retrieve a group from the DB with the given arguments; create (and store) a new group if such a group was not present yet.
- Parameters
label – group label
- Returns
(group, created) where group is the group (new or existing, in any case already stored) and created is a boolean indicating whether the group was created
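The (group, created) contract can be sketched with a hypothetical dictionary-backed registry (get_or_create and factory here are illustrative names, not the actual backend code):

```python
def get_or_create(registry, label, factory):
    """Return (entry, created): fetch the entry by label, or build and store a new one."""
    if label in registry:
        return registry[label], False
    entry = factory(label)
    registry[label] = entry          # "store" the newly created entry
    return entry, True

groups = {}
g1, created1 = get_or_create(groups, 'pseudos', lambda lbl: {'label': lbl})
g2, created2 = get_or_create(groups, 'pseudos', lambda lbl: {'label': lbl})
assert created1 and not created2     # created only on the first call
assert g1 is g2                      # the same entry is returned thereafter
```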
- class aiida.orm.groups.GroupMeta(name, bases, namespace, **kwargs)[source]#
Bases:
abc.ABCMeta
Meta class for aiida.orm.groups.Group to automatically set the type_string attribute.
- __module__ = 'aiida.orm.groups'#
- class aiida.orm.groups.ImportGroup(label: Optional[str] = None, user: Optional[User] = None, description: str = '', type_string: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.groups.Group
Group to be used to contain all nodes from an export archive that has been imported.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.groups'#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- class aiida.orm.groups.UpfFamily(label: Optional[str] = None, user: Optional[User] = None, description: str = '', type_string: Optional[str] = None, backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.groups.Group
Group that represents a pseudo potential family containing UpfData nodes.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.groups'#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- aiida.orm.groups.load_group_class(type_string: str) Type[aiida.orm.groups.Group] [source]#
Load the sub class of Group that corresponds to the given type_string.
Note
will fall back on aiida.orm.groups.Group if type_string cannot be resolved to a loadable entry point.
- Parameters
type_string – the entry point name of the Group sub class
- Returns
sub class of Group registered through an entry point
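The entry-point lookup with fallback can be sketched as a dictionary lookup (a toy model; the real implementation resolves entry points via the plugin system):

```python
def load_group_class_sketch(registry, type_string, default):
    """Resolve type_string via an entry-point registry, falling back on the default class."""
    return registry.get(type_string, default)

class Group: ...
class UpfFamily(Group): ...

# Hypothetical registry mapping entry-point names to Group subclasses.
registry = {'core.upf': UpfFamily}
assert load_group_class_sketch(registry, 'core.upf', Group) is UpfFamily
# Unresolvable type strings fall back on the base Group class.
assert load_group_class_sketch(registry, 'no.such.type', Group) is Group
```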
Module for orm logging abstract classes
- class aiida.orm.logs.Log(time: datetime.datetime, loggername: str, levelname: str, dbnode_id: int, message: str = '', metadata: Optional[Dict[str, Any]] = None, backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Entity
[BackendLog
]An AiiDA Log entity. Corresponds to a logged message against a particular AiiDA node.
- _CLS_COLLECTION#
alias of
aiida.orm.logs.LogCollection
- __abstractmethods__ = frozenset({})#
- __init__(time: datetime.datetime, loggername: str, levelname: str, dbnode_id: int, message: str = '', metadata: Optional[Dict[str, Any]] = None, backend: Optional[StorageBackend] = None)[source]#
Construct a new log
- Parameters
time – time
loggername – name of logger
levelname – name of log level
dbnode_id – id of database node
message – log message
metadata – metadata
backend – database backend
- __module__ = 'aiida.orm.logs'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendLog')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- property dbnode_id: int#
Get the id of the object that created the log entry
- Returns
The id of the object that created the log entry
- property loggername: str#
The name of the logger that created this entry
- Returns
The entry loggername
- property metadata: Dict[str, Any]#
Get the metadata corresponding to the entry
- Returns
The entry metadata
- property time: datetime.datetime#
Get the time corresponding to the entry
- Returns
The entry timestamp
- class aiida.orm.logs.LogCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Collection
[Log
]This class represents the collection of logs and can be used to create and retrieve logs.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.logs'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('Log')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.logs.Log] [source]#
The allowed entity class or subclasses thereof.
- create_entry_from_record(record: logging.LogRecord) Optional[aiida.orm.logs.Log] [source]#
Helper function to create a log entry from a record created by the Python logging library
- Parameters
record – The record created by the logging module
- Returns
A stored log instance
- delete(pk: int) None [source]#
Remove a Log entry from the collection with the given id
- Parameters
pk – id of the Log to delete
- Raises
NotExistent – if Log with ID
pk
is not found
- delete_all() None [source]#
Delete all Logs in the collection
- Raises
IntegrityError – if all Logs could not be deleted
The QueryBuilder: a class that allows you to query the AiiDA database, independent of the backend.
Note that the backend implementation is enforced and handled with a composition model!
QueryBuilder() is the frontend class that the user can use. It inherits from object; backend-specific functionality is provided by the implementation classes. These inherit from aiida.orm.implementation.querybuilder.BackendQueryBuilder(), an interface class which enforces the implementation of its defined methods. An instance of one of the implementation classes becomes a member of the QueryBuilder() instance when instantiated by the user.
- class aiida.orm.querybuilder.Classifier(ormclass_type_string: str, process_type_string: Optional[str] = None)[source]#
Bases:
tuple
A classifier for an entity.
- __annotations__ = {'ormclass_type_string': ForwardRef('str'), 'process_type_string': ForwardRef('Optional[str]')}#
- __getnewargs__()#
Return self as a plain tuple. Used by copy and pickle.
- __module__ = 'aiida.orm.querybuilder'#
- static __new__(_cls, ormclass_type_string: str, process_type_string: Optional[str] = None)#
Create new instance of Classifier(ormclass_type_string, process_type_string)
- __repr__()#
Return a nicely formatted representation string
- __slots__ = ()#
- _asdict()#
Return a new dict which maps field names to their values.
- _field_defaults = {'process_type_string': None}#
- _field_types = {'ormclass_type_string': ForwardRef('str'), 'process_type_string': ForwardRef('Optional[str]')}#
- _fields = ('ormclass_type_string', 'process_type_string')#
- _fields_defaults = {}#
- classmethod _make(iterable)#
Make a new Classifier object from a sequence or iterable
- _replace(**kwds)#
Return a new Classifier object replacing specified fields with new values
- class aiida.orm.querybuilder.QueryBuilder(backend: Optional[StorageBackend] = None, *, debug: bool = False, path: Optional[Sequence[Union[str, Dict[str, Any], Type[Union[aiida.orm.entities.Entity, Process]]]]] = (), filters: Optional[Dict[str, Dict[str, Any]]] = None, project: Optional[Dict[str, Union[str, dict, Sequence[Union[str, dict]]]]] = None, limit: Optional[int] = None, offset: Optional[int] = None, order_by: Optional[Union[dict, List[dict], Tuple[dict, ...]]] = None, distinct: bool = False)[source]#
Bases:
object
The class to query the AiiDA database.
Usage:
from aiida.orm.querybuilder import QueryBuilder
qb = QueryBuilder()
# Querying nodes:
qb.append(Node)
# retrieving the results:
results = qb.all()
- _EDGE_TAG_DELIM = '--'#
- _VALID_PROJECTION_KEYS = ('func', 'cast')#
- __deepcopy__(memo) QueryBuilder [source]#
Create deep copy of the instance.
- __dict__ = mappingproxy({'__module__': 'aiida.orm.querybuilder', '__doc__': '\n The class to query the AiiDA database.\n\n Usage::\n\n from aiida.orm.querybuilder import QueryBuilder\n qb = QueryBuilder()\n # Querying nodes:\n qb.append(Node)\n # retrieving the results:\n results = qb.all()\n\n ', '_EDGE_TAG_DELIM': '--', '_VALID_PROJECTION_KEYS': ('func', 'cast'), '__init__': <function QueryBuilder.__init__>, 'backend': <property object>, 'as_dict': <function QueryBuilder.as_dict>, 'queryhelp': <property object>, 'from_dict': <classmethod object>, '__repr__': <function QueryBuilder.__repr__>, '__str__': <function QueryBuilder.__str__>, '__deepcopy__': <function QueryBuilder.__deepcopy__>, 'get_used_tags': <function QueryBuilder.get_used_tags>, '_get_unique_tag': <function QueryBuilder._get_unique_tag>, 'append': <function QueryBuilder.append>, 'order_by': <function QueryBuilder.order_by>, 'add_filter': <function QueryBuilder.add_filter>, '_process_filters': <staticmethod object>, '_add_node_type_filter': <function QueryBuilder._add_node_type_filter>, '_add_process_type_filter': <function QueryBuilder._add_process_type_filter>, '_add_group_type_filter': <function QueryBuilder._add_group_type_filter>, 'add_projection': <function QueryBuilder.add_projection>, 'set_debug': <function QueryBuilder.set_debug>, 'debug': <function QueryBuilder.debug>, 'limit': <function QueryBuilder.limit>, 'offset': <function QueryBuilder.offset>, 'distinct': <function QueryBuilder.distinct>, 'inputs': <function QueryBuilder.inputs>, 'outputs': <function QueryBuilder.outputs>, 'children': <function QueryBuilder.children>, 'parents': <function QueryBuilder.parents>, 'as_sql': <function QueryBuilder.as_sql>, 'analyze_query': <function QueryBuilder.analyze_query>, '_get_aiida_entity_res': <staticmethod object>, 'first': <function QueryBuilder.first>, 'count': <function QueryBuilder.count>, 'iterall': <function QueryBuilder.iterall>, 'iterdict': <function QueryBuilder.iterdict>, 'all': 
<function QueryBuilder.all>, 'one': <function QueryBuilder.one>, 'dict': <function QueryBuilder.dict>, '__dict__': <attribute '__dict__' of 'QueryBuilder' objects>, '__weakref__': <attribute '__weakref__' of 'QueryBuilder' objects>, '__annotations__': {'_impl': 'BackendQueryBuilder', '_path': 'List[PathItemType]', '_filters': 'Dict[str, FilterType]', '_projections': 'Dict[str, List[Dict[str, Dict[str, Any]]]]', '_order_by': 'List[Dict[str, List[Dict[str, Dict[str, str]]]]]', '_limit': 'Optional[int]', '_offset': 'Optional[int]', '_distinct': 'bool'}})#
- __init__(backend: Optional[StorageBackend] = None, *, debug: bool = False, path: Optional[Sequence[Union[str, Dict[str, Any], Type[Union[aiida.orm.entities.Entity, Process]]]]] = (), filters: Optional[Dict[str, Dict[str, Any]]] = None, project: Optional[Dict[str, Union[str, dict, Sequence[Union[str, dict]]]]] = None, limit: Optional[int] = None, offset: Optional[int] = None, order_by: Optional[Union[dict, List[dict], Tuple[dict, ...]]] = None, distinct: bool = False) None [source]#
Instantiates a QueryBuilder instance.
Which backend is used is decided here, based on the backend settings (taken from the user profile). This cannot currently be overridden by the user.
- Parameters
debug – Turn on debug mode. This feature prints information on the screen about the stages of the QueryBuilder. Does not affect results.
path – A list of the vertices to traverse. Leave empty if you plan on using the method QueryBuilder.append().
filters – The filters to apply. You can specify the filters here, when appending to the query using QueryBuilder.append(), or even later using QueryBuilder.add_filter(). The latter gives API details.
project – The projections to apply. You can specify the projections here, when appending to the query using QueryBuilder.append(), or even later using QueryBuilder.add_projection(). The latter gives API details.
limit – Limit the number of rows to this number. Check QueryBuilder.limit() for more information.
offset – Set an offset for the results returned. Details in QueryBuilder.offset().
order_by – How to order the results. Like the two above, this can also be set at a later stage; check QueryBuilder.order_by() for more information.
distinct – Whether to return de-duplicated rows.
- __module__ = 'aiida.orm.querybuilder'#
- __weakref__#
list of weak references to the object (if defined)
- _add_group_type_filter(tagspec: str, classifiers: List[aiida.orm.querybuilder.Classifier], subclassing: bool) None [source]#
Add a filter based on group type.
- Parameters
tagspec – The tag, which has to exist already as a key in self._filters
classifiers – a list of classifiers
subclassing – if True, allow for subclasses of the ormclass
- _add_node_type_filter(tagspec: str, classifiers: List[aiida.orm.querybuilder.Classifier], subclassing: bool)[source]#
Add a filter based on node type.
- Parameters
tagspec – The tag, which has to exist already as a key in self._filters
classifiers – a list of classifiers
subclassing – if True, allow for subclasses of the ormclass
- _add_process_type_filter(tagspec: str, classifiers: List[aiida.orm.querybuilder.Classifier], subclassing: bool) None [source]#
Add a filter based on process type.
- Parameters
tagspec – The tag, which has to exist already as a key in self._filters
classifiers – a list of classifiers
subclassing – if True, allow for subclasses of the process type
Note: This function handles the case when process_type_string is None.
- static _get_aiida_entity_res(value) Any [source]#
Convert a projected query result to front end class if it is an instance of a BackendEntity.
Values that are not a BackendEntity instance will be returned unaltered
- Parameters
value – a projected query result to convert
- Returns
the converted value
- _get_unique_tag(classifiers: List[aiida.orm.querybuilder.Classifier]) str [source]#
Using the function get_tag_from_type, a tag is obtained. An index appended to that tag is incremented until an unused tag is found. This function is called in QueryBuilder.append() when no tag is given.
- Parameters
classifiers (dict) – Classifiers, containing the string that defines the type of the AiiDA ORM class. For subclasses of Node this is the Node._plugin_type_string; for others it is as returned by QueryBuilder._get_ormclass(). Can also be a list of dictionaries, when multiple classes are passed to QueryBuilder.append.
- Returns
A tag as a string (a single string also when passing multiple classes).
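The increment-until-unused scheme can be sketched as follows (a minimal sketch; the base-plus-index naming pattern shown here is illustrative, not necessarily the exact format used internally):

```python
def get_unique_tag_sketch(base, used_tags):
    """Append an incrementing index to `base` until the resulting tag is unused."""
    index = 1
    tag = f'{base}_{index}'
    while tag in used_tags:
        index += 1
        tag = f'{base}_{index}'
    return tag

used = {'node_1', 'node_2'}
assert get_unique_tag_sketch('node', used) == 'node_3'
```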
- add_filter(tagspec: Union[str, Type[Union[aiida.orm.entities.Entity, Process]]], filter_spec: Dict[str, Any]) QueryBuilder [source]#
Add a filter to the query.
- Parameters
tagspec – A tag string or an ORM class which maps to an existing tag
filter_spec – The specifications for the filter, has to be a dictionary
Usage:
qb = QueryBuilder()                    # Instantiating the QueryBuilder instance
qb.append(Node, tag='node')            # Appending a Node
# Let's put some filters:
qb.add_filter('node', {'id': {'>': 12}})
# 2 filters together:
qb.add_filter('node', {'label': 'foo', 'uuid': {'like': 'ab%'}})
# Now overriding the first filter that was set:
qb.add_filter('node', {'id': 13})
- add_projection(tag_spec: Union[str, Type[Union[aiida.orm.entities.Entity, Process]]], projection_spec: Union[str, dict, Sequence[Union[str, dict]]]) None [source]#
Adds a projection
- Parameters
tag_spec – A tag string or an ORM class which maps to an existing tag
projection_spec – The specification for the projection. A projection is a list of dictionaries, with each dictionary containing key-value pairs where the key is database entity (e.g. a column / an attribute) and the value is (optional) additional information on how to process this database entity.
If the given projection_spec is not a list, it will be expanded to a list. If the list items are not dictionaries but strings (no additional processing of the projected results desired), they will be expanded to dictionaries.
Usage:
qb = QueryBuilder()
qb.append(StructureData, tag='struc')
# Will project the uuid and the kinds
qb.add_projection('struc', ['uuid', 'attributes.kinds'])
The above example will project the uuid and the kinds attribute of all matching structures. There are (so far) two special keys.
The single star * will project the ORM instance:
qb = QueryBuilder()
qb.append(StructureData, tag='struc')
# Will project the ORM instance
qb.add_projection('struc', '*')
print(type(qb.first()[0]))
# >>> aiida.orm.nodes.data.structure.StructureData
- The double star ** projects all possible projections of this entity:
QueryBuilder().append(StructureData, tag='s', project='**').limit(1).dict()[0]['s'].keys()
# >>> 'user_id, description, ctime, label, extras, mtime, id, attributes, dbcomputer_id, type, uuid'
Be aware that the result of ** depends on the backend implementation.
- all(batch_size: Optional[int] = None, flat: bool = False) Union[List[List[Any]], List[Any]] [source]#
Executes the full query with the order of the rows as returned by the backend.
The order inside each row is given by the order of the vertices in the path and the order of the projections for each vertex in the path.
- Parameters
batch_size – the size of the batches to ask the backend to batch results in subcollections. You can optimize the speed of the query by tuning this parameter. Leave the default None if speed is not critical or if you don’t know what you’re doing.
flat – return the result as a flat list of projected entities without sub lists.
- Returns
a list of lists of all projected entities.
- analyze_query(execute: bool = True, verbose: bool = False) str [source]#
Return the query plan, i.e. a list of SQL statements that will be executed.
See: https://www.postgresql.org/docs/11/sql-explain.html
- Params execute
Carry out the command and show actual run times and other statistics.
- Params verbose
Display additional information regarding the plan.
- append(cls: Optional[Union[Type[Union[aiida.orm.entities.Entity, Process]], Sequence[Type[Union[aiida.orm.entities.Entity, Process]]]]] = None, entity_type: Optional[Union[str, Sequence[str]]] = None, tag: Optional[str] = None, filters: Optional[Dict[str, Any]] = None, project: Optional[Union[str, dict, Sequence[Union[str, dict]]]] = None, subclassing: bool = True, edge_tag: Optional[str] = None, edge_filters: Optional[Dict[str, Any]] = None, edge_project: Optional[Union[str, dict, Sequence[Union[str, dict]]]] = None, outerjoin: bool = False, joining_keyword: Optional[str] = None, joining_value: Optional[Any] = None, orm_base: Optional[str] = None, **kwargs: Any) QueryBuilder [source]#
Any iterative procedure to build the path for a graph query needs to invoke this method to append to the path.
- Parameters
cls –
The AiiDA class (or backend class) defining the appended vertex. Also supports a tuple/list of classes, which results in all instances of these classes being accepted in a query. However, the classes have to have the same ORM base class for the joining to work, i.e. both have to be subclasses of Node. Valid is:
cls=(StructureData, Dict)
This is invalid:
cls=(Group, Node)
entity_type – The node type of the class, if cls is not given. Also here, a tuple or list is accepted.
tag – A unique tag. If none is given, a unique tag will be created automatically.
filters – Filters to apply for this vertex. See add_filter(), the method invoked in the background, or the usage examples for details.
project – Projections to apply. See the usage examples for details. More information also in add_projection().
subclassing – Whether to include subclasses of the given class (default True). E.g. specifying ProcessNode as cls will include CalcJobNode, WorkChainNode, CalcFunctionNode, etc.
edge_tag – The tag that the edge will get. If nothing is specified (and there is a meaningful edge), the default is tag1--tag2, with tag1 being the entity joining from and tag2 being the entity joining to (this entity).
edge_filters – The filters to apply on the edge. Also here, details in add_filter().
edge_project – The projections for the edge. API details in add_projection().
outerjoin – If True (default is False), will do a left outer join instead of an inner join.
Joining can be specified in two ways:
Specifying the 'joining_keyword' and 'joining_value' arguments
Specifying a single keyword argument
The joining keyword will be with_* or direction, depending on the joining entity type. The joining value is the tag name or class of the entity to join to.
A small usage example of how this can be invoked:
qb = QueryBuilder()            # Instantiating an empty QueryBuilder instance
qb.append(cls=StructureData)   # First item is a StructureData node
# The next node in the path is a PwCalculation,
# with the structure joined as an input
qb.append(
    cls=PwCalculation,
    with_incoming=StructureData
)
- Returns
self
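The default edge tag described for edge_tag above can be sketched as a simple join on the '--' delimiter (`_EDGE_TAG_DELIM`); a minimal sketch, not the actual implementation:

```python
EDGE_TAG_DELIM = '--'

def default_edge_tag(tag_from, tag_to):
    """Build the default edge tag: joining-from tag, delimiter, joining-to tag."""
    return f'{tag_from}{EDGE_TAG_DELIM}{tag_to}'

# The structure vertex joins into the calculation vertex:
assert default_edge_tag('struc', 'calc') == 'struc--calc'
```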
- as_dict(copy: bool = True) aiida.orm.implementation.querybuilder.QueryDictType [source]#
Convert to a JSON serialisable dictionary representation of the query.
- as_sql(inline: bool = False) str [source]#
Convert the query to an SQL string representation.
Warning
This method should be used for debugging purposes only, since normally sqlalchemy will handle this process internally.
- Params inline
Inline bound parameters (this is normally handled by the Python DB-API).
- property backend: StorageBackend#
Return the backend used by the QueryBuilder.
- children(**kwargs: Any) QueryBuilder [source]#
Join to children/descendants of the previous vertex in the path.
- Returns
self
- count() int [source]#
Counts the number of rows returned by the backend.
- Returns
the number of rows as an integer
- debug(msg: str, *objects: Any) None [source]#
Log debug message.
objects will be passed to the format string, e.g.
msg % objects
- dict(batch_size: Optional[int] = None) List[Dict[str, Dict[str, Any]]] [source]#
Executes the full query with the order of the rows as returned by the backend. The order inside each row is given by the order of the vertices in the path and the order of the projections for each vertex in the path.
- Parameters
batch_size – The size of the batches to ask the backend to batch results in subcollections. You can optimize the speed of the query by tuning this parameter. Leave the default (None) if speed is not critical or if you don’t know what you’re doing!
- Returns
A list of dictionaries of all projected entities: tag -> field -> value
Usage:
qb = QueryBuilder()
qb.append(
    StructureData,
    tag='structure',
    filters={'uuid': {'==': myuuid}},
)
qb.append(
    Node,
    with_ancestors='structure',
    project=['entity_type', 'id'],  # returns entity_type (string) and id (string)
    tag='descendant'
)
# Return the dictionaries:
print("qb.iterdict()")
for d in qb.iterdict():
    print('>>>', d)
results in the following output:
qb.iterdict()
>>> {'descendant': {'entity_type': 'calculation.job.quantumespresso.pw.PwCalculation.', 'id': 7716}}
>>> {'descendant': {'entity_type': 'data.remote.RemoteData.', 'id': 8510}}
- distinct(value: bool = True) QueryBuilder [source]#
Asks for distinct rows, which is the same as asking the backend to remove duplicates. Does not execute the query!
If you want a distinct query:
qb = QueryBuilder()
# append stuff!
qb.append(...)
qb.append(...)
...
qb.distinct().all()
# or
qb.distinct().dict()
- Returns
self
- first(flat: Literal[False]) Optional[list[Any]] [source]#
- first(flat: Literal[True]) Optional[Any]
Return the first result of the query.
Calling
first
results in an execution of the underlying query.Note, this may change if several rows are valid for the query, as persistent ordering is not guaranteed unless explicitly specified.
- Parameters
flat – if True, return just the projected quantity if there is just a single projection.
- Returns
One row of results as a list, or None if no result returned.
- classmethod from_dict(dct: Dict[str, Any]) QueryBuilder [source]#
Create an instance from a dictionary representation of the query.
- get_used_tags(vertices: bool = True, edges: bool = True) List[str] [source]#
Returns a list of all the tags that are being used.
- Parameters
vertices – If True, adds the tags of vertices to the returned list
edges – If True, adds the tags of edges to the returned list.
- Returns
A list of tags
- inputs(**kwargs: Any) QueryBuilder [source]#
Join to inputs of the previous vertex in the path.
- Returns
self
- iterall(batch_size: Optional[int] = 100) Iterable[List[Any]] [source]#
Same as
all()
, but returns a generator. Be aware that this is only safe if no commit will take place during this transaction. You might also want to read the SQLAlchemy documentation on https://docs.sqlalchemy.org/en/14/orm/query.html#sqlalchemy.orm.Query.yield_per
- Parameters
batch_size – The size of the batches to ask the backend to batch results in subcollections. You can optimize the speed of the query by tuning this parameter.
- Returns
a generator of lists
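The batch_size behaviour can be sketched with a plain generator over an in-memory list (a toy stand-in for the backend's batched fetching; the real backend fetches batches from the database):

```python
def iter_batched(rows, batch_size=100):
    """Yield rows one at a time, fetching them from the source in batches."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]   # one "backend round trip"
        yield from batch

# All 250 rows come back in order, fetched in three batches of up to 100.
results = list(iter_batched(list(range(250)), batch_size=100))
assert results == list(range(250))
```

Tuning batch_size trades memory (larger batches) against the number of round trips (smaller batches).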
- iterdict(batch_size: Optional[int] = 100) Iterable[Dict[str, Dict[str, Any]]] [source]#
Same as
dict()
, but returns a generator. Be aware that this is only safe if no commit will take place during this transaction. You might also want to read the SQLAlchemy documentation on https://docs.sqlalchemy.org/en/14/orm/query.html#sqlalchemy.orm.Query.yield_per
- Parameters
batch_size – The size of the batches to ask the backend to batch results in subcollections. You can optimize the speed of the query by tuning this parameter.
- Returns
a generator of dictionaries
- limit(limit: Optional[int]) QueryBuilder [source]#
Set the limit (number of rows to return)
- Parameters
limit – the maximum number of rows to return
- offset(offset: Optional[int]) QueryBuilder [source]#
Set the offset. If offset is set, that many rows are skipped before returning. offset = 0 is the same as not setting an offset. If both offset and limit appear, then offset rows are skipped before starting to count the limit rows that are returned.
- Parameters
offset – the number of rows to skip
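The combined offset/limit semantics described above correspond to a list slice: skip offset rows, then count out at most limit rows. A minimal sketch (not the actual SQL implementation):

```python
def apply_offset_limit(rows, offset=None, limit=None):
    """Skip `offset` rows, then return at most `limit` rows."""
    start = offset or 0
    return rows[start:start + limit] if limit is not None else rows[start:]

rows = list(range(10))
assert apply_offset_limit(rows, offset=3, limit=4) == [3, 4, 5, 6]
# Fewer than `limit` rows remain after the offset: return what is left.
assert apply_offset_limit(rows, offset=8, limit=4) == [8, 9]
```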
- one() List[Any] [source]#
Executes the query, asking for exactly one result.
Will raise an exception if this is not the case:
- Raises
MultipleObjectsError if more than one row can be returned
- Raises
NotExistent if no result was found
- order_by(order_by: Union[dict, List[dict], Tuple[dict, ...]]) QueryBuilder [source]#
Set the entity to order by
- Parameters
order_by – This is a list of items, where each item is a dictionary that specifies what to sort for an entity
In each dictionary in that list, keys represent valid tags of entities (tables), and values are list of columns.
Usage:
# Sorting by id (ascending):
qb = QueryBuilder()
qb.append(Node, tag='node')
qb.order_by({'node': ['id']})

# or, equivalently:
qb = QueryBuilder()
qb.append(Node, tag='node')
qb.order_by({'node': [{'id': {'order': 'asc'}}]})

# For descending order:
qb = QueryBuilder()
qb.append(Node, tag='node')
qb.order_by({'node': [{'id': {'order': 'desc'}}]})

# or (shorter):
qb = QueryBuilder()
qb.append(Node, tag='node')
qb.order_by({'node': [{'id': 'desc'}]})
- outputs(**kwargs: Any) QueryBuilder [source]#
Join to outputs of the previous vertex in the path.
- Returns
self
- parents(**kwargs: Any) QueryBuilder [source]#
Join to parents/ancestors of the previous vertex in the path.
- Returns
self
- property queryhelp: QueryDictType#
Legacy name for the
as_dict
method.
- set_debug(debug: bool) QueryBuilder [source]#
Run in debug mode. This does not affect functionality, but prints the intermediate stages of query construction to screen.
- Parameters
debug – Turn debug on or off
- class aiida.orm.querybuilder._QueryTagMap[source]#
Bases:
object
Cache of tag mappings for a query.
- __dict__ = mappingproxy({'__module__': 'aiida.orm.querybuilder', '__doc__': 'Cache of tag mappings for a query.', '__init__': <function _QueryTagMap.__init__>, '__repr__': <function _QueryTagMap.__repr__>, '__contains__': <function _QueryTagMap.__contains__>, '__iter__': <function _QueryTagMap.__iter__>, 'add': <function _QueryTagMap.add>, 'remove': <function _QueryTagMap.remove>, 'get': <function _QueryTagMap.get>, '__dict__': <attribute '__dict__' of '_QueryTagMap' objects>, '__weakref__': <attribute '__weakref__' of '_QueryTagMap' objects>, '__annotations__': {'_tag_to_type': 'Dict[str, Union[None, EntityTypes]]', '_cls_to_tag_map': 'Dict[Any, Set[str]]'}})#
- __module__ = 'aiida.orm.querybuilder'#
- __weakref__#
list of weak references to the object (if defined)
- add(tag: str, etype: Union[None, aiida.orm.entities.EntityTypes] = None, klasses: Optional[Union[Type[Union[aiida.orm.entities.Entity, Process]], Sequence[Type[Union[aiida.orm.entities.Entity, Process]]]]] = None) None [source]#
Add a tag.
- get(tag_or_cls: Union[str, Type[Union[aiida.orm.entities.Entity, Process]]]) str [source]#
Return the tag or, given a class(es), map to a tag.
- Raises
ValueError – if the tag is not found, or the class(es) does not map to a single tag
- aiida.orm.querybuilder._get_group_type_filter(classifiers: aiida.orm.querybuilder.Classifier, subclassing: bool) dict [source]#
Return filter dictionaries for Group.type_string given a set of classifiers.
- Parameters
classifiers – a dictionary with classifiers (note: does not support lists)
subclassing – if True, allow for subclasses of the ormclass
- Returns
dictionary in QueryBuilder filter language to pass into {‘type_string’: … }
- aiida.orm.querybuilder._get_node_type_filter(classifiers: aiida.orm.querybuilder.Classifier, subclassing: bool) dict [source]#
Return filter dictionaries given a set of classifiers.
- Parameters
classifiers – a dictionary with classifiers (note: does not support lists)
subclassing – if True, allow for subclasses of the ormclass
- Returns
dictionary in QueryBuilder filter language to pass into {“type”: … }
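The shape of the returned filter can be illustrated with a simplified version: without subclassing the filter is an exact comparison on the type string, while with subclassing it becomes a prefix match, since subclasses share their parent's type-string prefix. This is only a sketch; the real implementation also escapes SQL wildcards and special-cases the base classes:

```python
def node_type_filter(type_string: str, subclassing: bool) -> dict:
    """Build a QueryBuilder filter value for the 'type' column (simplified)."""
    if not subclassing:
        return {'==': type_string}        # match this exact class only
    # Subclasses share the type-string prefix, so a SQL LIKE prefix
    # match selects the class together with all of its subclasses.
    return {'like': f'{type_string}%'}
```

For example, a filter built for the prefix 'data.core.' would also match 'data.core.int.Int.'.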
- aiida.orm.querybuilder._get_ormclass(cls: Union[None, Type[Union[aiida.orm.entities.Entity, Process]], Sequence[Type[Union[aiida.orm.entities.Entity, Process]]]], entity_type: Union[None, str, Sequence[str]]) Tuple[aiida.orm.entities.EntityTypes, List[aiida.orm.querybuilder.Classifier]] [source]#
Get ORM classifiers from either class(es) or ormclass_type_string(s).
- Parameters
cls – a class or tuple/set/list of classes that are either AiiDA ORM classes or backend ORM classes.
entity_type – type string(s) for the ORM class(es)
- Returns
the ORM class as well as a dictionary with additional classifier strings
Handles the case of lists as well.
- aiida.orm.querybuilder._get_ormclass_from_cls(cls: Type[Union[aiida.orm.entities.Entity, Process]]) Tuple[aiida.orm.entities.EntityTypes, aiida.orm.querybuilder.Classifier] [source]#
Return the correct classifiers for the QueryBuilder from an ORM class.
- Parameters
cls – an AiiDA ORM class or backend ORM class.
- Returns
the ORM class as well as a dictionary with additional classifier strings
- Note: the ormclass_type_string is currently hardcoded for group, computer etc. One could instead use something like
aiida.orm.utils.node.get_type_string_from_class(cls.__module__, cls.__name__)
- aiida.orm.querybuilder._get_ormclass_from_str(type_string: str) Tuple[aiida.orm.entities.EntityTypes, aiida.orm.querybuilder.Classifier] [source]#
Return the correct classifiers for the QueryBuilder from an ORM type string.
- Parameters
type_string – type string for ORM class
- Returns
the ORM class as well as a dictionary with additional classifier strings
- aiida.orm.querybuilder._get_process_type_filter(classifiers: aiida.orm.querybuilder.Classifier, subclassing: bool) dict [source]#
Return filter dictionaries given a set of classifiers.
- Parameters
classifiers – a dictionary with classifiers (note: does not support lists)
subclassing – if True, allow for subclasses of the process type. This is only activated if an entry point can be found for the process type (as well as for a selection of built-in process types)
- Returns
dictionary in QueryBuilder filter language to pass into {“process_type”: … }
Module for the ORM user class.
- class aiida.orm.users.User(email: str, first_name: str = '', last_name: str = '', institution: str = '', backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Entity
[BackendUser
]AiiDA User
- _CLS_COLLECTION#
alias of
aiida.orm.users.UserCollection
- __abstractmethods__ = frozenset({})#
- __init__(email: str, first_name: str = '', last_name: str = '', institution: str = '', backend: Optional[StorageBackend] = None)[source]#
Create a new User.
- __module__ = 'aiida.orm.users'#
- __orig_bases__ = (aiida.orm.entities.Entity[ForwardRef('BackendUser')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- get_short_name() str [source]#
Return the user's short name (typically, this returns the email)
- Returns
The short name
- class aiida.orm.users.UserCollection(entity_class: Type[aiida.orm.entities.EntityType], backend: Optional[StorageBackend] = None)[source]#
Bases:
aiida.orm.entities.Collection
[User
]The collection of users stored in a backend.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.orm.users'#
- __orig_bases__ = (aiida.orm.entities.Collection[ForwardRef('User')],)#
- __parameters__ = ()#
- _abc_impl = <_abc_data object>#
- static _entity_base_cls() Type[aiida.orm.users.User] [source]#
The allowed entity class or subclasses thereof.
- get_default() Optional[aiida.orm.users.User] [source]#
Get the current default user
- get_or_create(email: str, **kwargs) Tuple[bool, aiida.orm.users.User] [source]#
Get the existing user with a given email address or create an unstored one
- Parameters
kwargs – The properties of the user to get or create
- Returns
The corresponding user object
- Raises
aiida.common.exceptions.MultipleObjectsError
,aiida.common.exceptions.NotExistent
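The get-or-create pattern described above can be sketched independently of the backend. Here a plain dict stands in for the user table, and, as in AiiDA, the first element of the returned tuple signals whether a new (unstored) user was created:

```python
from typing import Any, Dict, Tuple

def get_or_create(store: Dict[str, dict], email: str,
                  **kwargs: Any) -> Tuple[bool, dict]:
    """Return (created, user): the existing user matched by email,
    or a freshly built record that the caller may later store."""
    if email in store:
        return False, store[email]        # found: nothing was created
    # Not found: build a new record; storing it is left to the caller.
    return True, {'email': email, **kwargs}
```

In the real collection the lookup goes through the query backend and a lookup matching multiple users raises MultipleObjectsError, which this dict-based sketch cannot reproduce.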