aiida.common package#

Common data structures, utility classes and functions

Note

Modules in this sub-package must run without a loaded database environment

Submodules#

Module to define the (physical) constants used throughout the code.

Module to define commonly used data structures.

class aiida.common.datastructures.CalcInfo(dictionary=None)[source]#

Bases: DefaultFieldsAttributeDict

This object stores the data returned by the calculation plugin, to be passed to the ExecManager.

In the following descriptions, all paths are to be considered relative.

  • retrieve_list: a list of strings or tuples that indicate files that are to be retrieved from the remote after the

    calculation has finished and stored in the retrieved_folder output node of type FolderData. If the entry in the list is just a string, it is assumed to be the filepath on the remote and it will be copied to the base directory of the retrieved folder, where the name corresponds to the basename of the remote relative path. This means that any remote folder hierarchy is ignored entirely.

    Remote folder hierarchy can be (partially) maintained by using a tuple instead, with the following format

    (source, target, depth)

    The source and target elements are relative filepaths in the remote and retrieved folder. The contents of source (whether it is a file or folder) are copied in its entirety to the target subdirectory in the retrieved folder. If no subdirectory should be created, '.' should be specified for target.

    The source filepaths support glob patterns * in case the exact names of the files that are to be retrieved are not known a priori.

    The depth element can be used to control what level of nesting of the source folder hierarchy should be maintained. If depth equals 0 or 1 (they are equivalent), only the basename of the source filepath is kept. For each additional level, another subdirectory of the remote hierarchy is kept. For example:

    (‘path/sub/file.txt’, ‘.’, 2)

    will retrieve the file.txt and store it under the path:

    sub/file.txt

  • retrieve_temporary_list: a list of strings or tuples that indicate files that will be retrieved

    and stored temporarily in a FolderData, which will be available only during the parsing call. The format of the list is the same as that of ‘retrieve_list’.

  • local_copy_list: a list of tuples with format (‘node_uuid’, ‘filename’, ‘relativedestpath’)

  • remote_copy_list: a list of tuples with format (‘remotemachinename’, ‘remoteabspath’, ‘relativedestpath’)

  • remote_symlink_list: a list of tuples with format (‘remotemachinename’, ‘remoteabspath’, ‘relativedestpath’)

  • provenance_exclude_list: a sequence of relative paths of files in the sandbox folder of a CalcJob instance that

    should not be stored permanently in the repository folder of the corresponding CalcJobNode that will be created, but should only be copied to the remote working directory on the target computer. This is useful for input files that have to be copied to the working directory but should not also be stored in the repository, for example because they contain proprietary information, or because they are large and their content is already indirectly present in the repository through one of the data nodes passed as input to the calculation.

  • codes_info: a list of dictionaries used to pass the information for the execution of each code

  • codes_run_mode: the mode of execution in which the codes will be run (CodeRunMode.SERIAL by default,

    but can also be CodeRunMode.PARALLEL)

  • skip_submit: a flag that, when set to True, orders the engine to skip the submit/update steps (so no code will

    run; the engine will only upload the files and then retrieve/parse).

  • file_copy_operation_order: Order in which input files are copied to the working directory. Should be a list of aiida.common.datastructures.FileCopyOperation instances.
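The depth semantics of retrieve_list tuples described above can be sketched in plain Python. The helper retrieved_path below is hypothetical (not part of AiiDA) and ignores glob patterns for brevity:

```python
def retrieved_path(source: str, target: str, depth: int) -> str:
    """Compute where a retrieve_list tuple entry ends up in the retrieved folder."""
    parts = source.split('/')
    # depth 0 and 1 are equivalent: keep only the basename; each extra
    # level keeps one more directory of the remote hierarchy.
    kept = parts[-1:] if depth in (0, 1) else parts[-depth:]
    relative = '/'.join(kept)
    # '.' as target means: place directly in the retrieved folder root.
    return relative if target == '.' else f'{target}/{relative}'

# The example from the docstring: ('path/sub/file.txt', '.', 2)
assert retrieved_path('path/sub/file.txt', '.', 2) == 'sub/file.txt'
```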

__annotations__ = {}#
__module__ = 'aiida.common.datastructures'#
_default_fields = ('job_environment', 'email', 'email_on_started', 'email_on_terminated', 'uuid', 'prepend_text', 'append_text', 'num_machines', 'num_mpiprocs_per_machine', 'priority', 'max_wallclock_seconds', 'max_memory_kb', 'rerunnable', 'retrieve_list', 'retrieve_temporary_list', 'local_copy_list', 'remote_copy_list', 'remote_symlink_list', 'provenance_exclude_list', 'codes_info', 'codes_run_mode', 'skip_submit', 'file_copy_operation_order')#
class aiida.common.datastructures.CalcJobState(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: Enum

The sub state of a CalcJobNode while its Process is in an active state (i.e. Running or Waiting).

PARSING = 'parsing'#
RETRIEVING = 'retrieving'#
STASHING = 'stashing'#
SUBMITTING = 'submitting'#
UPLOADING = 'uploading'#
WITHSCHEDULER = 'withscheduler'#
__module__ = 'aiida.common.datastructures'#
class aiida.common.datastructures.CodeInfo(dictionary=None)[source]#

Bases: DefaultFieldsAttributeDict

This attribute-dictionary contains the information needed to execute a code. Possible attributes are:

  • cmdline_params: a list of strings, containing parameters to be written on the command line right after the call to the code, as for example:

    code.x cmdline_params[0] cmdline_params[1] ... < stdin > stdout
    
  • stdin_name: (optional) the name of the standard input file. Note: it is only possible to use stdin with the syntax:

    code.x < stdin_name
    

    If no stdin_name is specified, the string “< stdin_name” will not be passed to the code. Note: it is not possible to substitute/remove the ‘<’ if stdin_name is specified; if that is needed, avoid stdin_name and instead use cmdline_params to specify a suitable syntax.

  • stdout_name: (optional) the name of the standard output file. Note: it is only possible to pass output to stdout_name with the syntax:

    code.x ... > stdout_name
    

    If no stdout_name is specified, the string “> stdout_name” will not be passed to the code. Note: it is not possible to substitute/remove the ‘>’ if stdout_name is specified; if that is needed, avoid stdout_name and instead use cmdline_params to specify a suitable syntax.

  • stderr_name: (optional) a string, the name of the error file of the code.

  • join_files: (optional) if True, redirects the error to the output file. If join_files=True, the code will be called as:

    code.x ... > stdout_name 2>&1
    

    otherwise, if join_files=False and stderr is passed:

    code.x ... > stdout_name 2> stderr_name
    
  • withmpi: if True, executes the code with mpirun (or another MPI launcher installed on the remote computer)

  • code_uuid: the uuid of the code associated with the CodeInfo
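Taken together, these attributes determine a command line of the following shape. The build_command helper and the executable name code.x below are illustrative only (the actual command is assembled by the engine), but the redirection rules mirror those described above:

```python
def build_command(cmdline_params, stdin_name=None, stdout_name=None,
                  stderr_name=None, join_files=False):
    """Render the shell command implied by a CodeInfo-like set of attributes."""
    parts = ['code.x', *cmdline_params]
    if stdin_name:
        # '<' is only added when stdin_name is given
        parts += ['<', stdin_name]
    if stdout_name:
        # likewise '>' is only added when stdout_name is given
        parts += ['>', stdout_name]
        if join_files:
            parts.append('2>&1')   # stderr joined into stdout
        elif stderr_name:
            parts += ['2>', stderr_name]
    return ' '.join(parts)

print(build_command(['-in'], stdin_name='aiida.in',
                    stdout_name='aiida.out', join_files=True))
```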

__annotations__ = {}#
__module__ = 'aiida.common.datastructures'#
_default_fields = ('cmdline_params', 'stdin_name', 'stdout_name', 'stderr_name', 'join_files', 'withmpi', 'code_uuid')#
class aiida.common.datastructures.CodeRunMode(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: IntEnum

Enum to indicate the way the codes of a calculation should be run.

For PARALLEL, the codes for a given calculation will be run in parallel by running them in the background:

code1.x &
code2.x &

For the SERIAL option, codes will be executed sequentially by running for example the following:

code1.x
code2.x
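A sketch of how a submit script might render the two modes. The render helper is hypothetical; the enum values mirror those documented below:

```python
from enum import IntEnum

class CodeRunMode(IntEnum):
    SERIAL = 0
    PARALLEL = 1

def render(codes, mode):
    if mode is CodeRunMode.PARALLEL:
        # Background each code with '&'; a final 'wait' joins them.
        return '\n'.join(f'{code} &' for code in codes) + '\nwait'
    # SERIAL: run the codes one after another.
    return '\n'.join(codes)

print(render(['code1.x', 'code2.x'], CodeRunMode.PARALLEL))
```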
PARALLEL = 1#
SERIAL = 0#
__format__(format_spec, /)#

Default object formatter.

__module__ = 'aiida.common.datastructures'#
__new__(value)#
class aiida.common.datastructures.FileCopyOperation(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: IntEnum

Enum to represent the copy operations that are used when creating the working directory of a CalcJob.

There are three different sources of files that are copied to the working directory on the remote computer where a calculation job is executed:

  • Local: files written to the temporary sandbox folder by the engine based on the local_copy_list defined by the plugin in the prepare_for_submission method.

  • Remote: files written directly to the remote working directory by the engine based on the remote_copy_list defined by the plugin in the prepare_for_submission method.

  • Sandbox: files written to a temporary sandbox folder on the local file system by the CalcJob plugin, first in the prepare_for_submission method, followed by the presubmit of the base class.

Historically, these operations were performed in the order of sandbox, local and remote. For certain use cases, however, this was deemed non-ideal, for example because files from the remote would override files written by the plugin itself in the sandbox. The CalcInfo.file_copy_operation_order attribute can be used to specify a list of this enum to indicate the desired order for file copy operations.
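For instance, a plugin whose own sandbox files must survive name clashes with remote files could reverse the historical order. A sketch (the enum values mirror those listed below; the list would be assigned to CalcInfo.file_copy_operation_order as described above):

```python
from enum import IntEnum

class FileCopyOperation(IntEnum):
    LOCAL = 0
    REMOTE = 1
    SANDBOX = 2

# Historical behavior: sandbox first, then local, then remote, so a
# remote file with the same name overwrites a sandbox file.
historical = [FileCopyOperation.SANDBOX, FileCopyOperation.LOCAL,
              FileCopyOperation.REMOTE]

# Putting SANDBOX last means files written by the plugin itself win.
custom = [FileCopyOperation.REMOTE, FileCopyOperation.LOCAL,
          FileCopyOperation.SANDBOX]
```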

LOCAL = 0#
REMOTE = 1#
SANDBOX = 2#
__format__(format_spec, /)#

Default object formatter.

__module__ = 'aiida.common.datastructures'#
__new__(value)#
class aiida.common.datastructures.StashMode(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: Enum

Mode to use when stashing files from the working directory of a completed calculation job for safekeeping.

COPY = 'copy'#
__module__ = 'aiida.common.datastructures'#

Miscellaneous functions for escaping strings.

aiida.common.escaping.escape_for_bash(str_to_escape, use_double_quotes=False)[source]#

This function takes any string and escapes it in a way that bash will interpret it as a single string.

Explanation:

At the end, in the return statement, the string is wrapped in single quotes. Therefore, the only character that has to be escaped for bash is the single quote itself. To do this, every single quote ' is substituted with the sequence '"'"', which means:

First single quote: exit from the enclosing single quotes

Second, third and fourth characters: "'" is a single quote character, escaped by double quotes

Last single quote: reopen the single quote to continue the string

Finally, note that in Python the sequence '"'"' has to be enclosed within triple quotes to make it work, giving the complicated string found in the source.

Parameters:
  • str_to_escape – the string to escape.

  • use_double_quotes – boolean, if True, use double quotes instead of single quotes.

Returns:

the escaped string.
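The technique reads more clearly in code. The helper below is an illustrative equivalent of the single-quote case, not the AiiDA implementation itself:

```python
def escape_single_quoted(text: str) -> str:
    # Wrap the whole string in single quotes; each embedded single quote
    # becomes '"'"' : close the quotes, emit a double-quoted ', reopen.
    return "'" + text.replace("'", "'\"'\"'") + "'"

print(escape_single_quoted("it's here"))
```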

aiida.common.escaping.escape_for_sql_like(string)[source]#

Escape any % or _ symbols provided by the user.

SQL LIKE syntax summary:

  • % -> match any number of characters

  • _ -> match exactly one character

aiida.common.escaping.get_regex_pattern_from_sql(sql_pattern)[source]#

Convert a string providing a pattern to match in SQL syntax into a string performing the same match as a regex.

SQL LIKE syntax summary:

  • % -> match any number of characters

  • _ -> match exactly one character

Moreover, \ is the escape character (by default), so:

  • \\ -> single backslash

  • \% -> literal % symbol

  • \_ -> literal _ symbol

Moreover, the resulting regex is anchored: the pattern must match the entire string, from the beginning of the line to the end of the line.

Parameters:

sql_pattern – the string with the pattern in SQL syntax

Returns:

a string with the pattern in regex syntax

aiida.common.escaping.sql_string_match(string, pattern)[source]#

Check if the string matches the provided pattern, specified using SQL syntax.

See documentation of get_regex_pattern_from_sql() for an explanation of the syntax.

Parameters:
  • string – the string to check

  • pattern – the SQL pattern

Returns:

True if the string matches, False otherwise
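The conversion rules above can be sketched as follows. This is an illustrative reimplementation, not the AiiDA functions themselves:

```python
import re

def sql_like_to_regex(sql_pattern: str) -> str:
    out, chars = ['^'], iter(sql_pattern)
    for ch in chars:
        if ch == '\\':                      # escape char: next char is literal
            out.append(re.escape(next(chars, '\\')))
        elif ch == '%':                     # any number of characters
            out.append('.*')
        elif ch == '_':                     # exactly one character
            out.append('.')
        else:
            out.append(re.escape(ch))
    out.append('$')                         # anchor at both ends
    return ''.join(out)

def sql_string_match(string: str, pattern: str) -> bool:
    return re.match(sql_like_to_regex(pattern), string) is not None
```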

Module that defines the exceptions thrown by AiiDA’s internal code.

exception aiida.common.exceptions.AiidaException[source]#

Bases: Exception

Base class for all AiiDA exceptions.

Each module will have its own subclass, inherited from this (e.g. ExecManagerException, TransportException, …)

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.exceptions.CircusCallError[source]#

Bases: AiidaException

Raised when an attempt to contact Circus returns an error in the response

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ClosedStorage[source]#

Bases: AiidaException

Raised when trying to access data from a closed storage backend.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ConfigurationError[source]#

Bases: AiidaException

Error raised when there is a configuration error in AiiDA.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ConfigurationVersionError[source]#

Bases: ConfigurationError

Configuration error raised when the configuration file version is not compatible with the current version.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ContentNotExistent[source]#

Bases: NotExistent

Raised when trying to access an attribute, a key or a file in the result nodes that is not present

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.CorruptStorage[source]#

Bases: ConfigurationError

Raised when the storage is not found to be internally consistent on validation.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.DatabaseMigrationError[source]#

Bases: AiidaException

Raised if a critical error is encountered during a storage migration.

Deprecated for StorageMigrationError

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.DbContentError[source]#

Bases: AiidaException

Raised when the content of the DB is not valid. This should never happen if the user does not play directly with the DB.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.EntryPointError[source]#

Bases: AiidaException

Raised when an entry point cannot be uniquely resolved and imported.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.FailedError[source]#

Bases: AiidaException

Raised when accessing a calculation that is in the FAILED status

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.FeatureDisabled[source]#

Bases: AiidaException

Raised when a feature is requested, but the user has chosen to disable it (e.g., for submissions on disabled computers).

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.FeatureNotAvailable[source]#

Bases: AiidaException

Raised when a feature that is not available is requested from a plugin.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.HashingError[source]#

Bases: AiidaException

Raised when an attempt to hash an object fails via a known failure mode

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.IncompatibleDatabaseSchema[source]#

Bases: ConfigurationError

Raised when the storage schema is incompatible with that of the code.

Deprecated for IncompatibleStorageSchema

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.IncompatibleStorageSchema[source]#

Bases: IncompatibleDatabaseSchema

Raised when the storage schema is incompatible with that of the code.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.InputValidationError[source]#

Bases: ValidationError

The input data for a calculation did not validate (e.g., missing required input data, wrong data, …)

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.IntegrityError[source]#

Bases: AiidaException

Raised when there is an underlying data integrity error. This can be database related or a general data integrity error. This can happen if, e.g., a foreign key check fails. See PEP 249 for details.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.InternalError[source]#

Bases: AiidaException

Error raised when there is an internal error of AiiDA.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.InvalidEntryPointTypeError[source]#

Bases: EntryPointError

Raised when a loaded entry point has a type that is not supported by the corresponding entry point group.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.InvalidOperation[source]#

Bases: AiidaException

The requested operation is not valid (e.g., when trying to add a non-internal attribute before saving the entry), or when deleting an entry that is protected (e.g., because it is referenced by foreign keys)

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.LicensingException[source]#

Bases: AiidaException

Raised when requirements for data licensing are not met.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.LoadingEntryPointError[source]#

Bases: EntryPointError

Raised when the resource corresponding to the requested entry point cannot be imported.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.LockedProfileError[source]#

Bases: AiidaException

Raised if attempting to access a locked profile

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.LockingProfileError[source]#

Bases: AiidaException

Raised if the profile cannot be locked

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.MissingConfigurationError[source]#

Bases: ConfigurationError

Configuration error raised when the configuration file is missing.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.MissingEntryPointError[source]#

Bases: EntryPointError

Raised when the requested entry point is not registered with the entry point manager.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ModificationNotAllowed[source]#

Bases: AiidaException

Raised when the user tries to modify a field, object, property, … that should not be modified.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.MultipleEntryPointError[source]#

Bases: EntryPointError

Raised when the requested entry point cannot uniquely be resolved by the entry point manager.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.MultipleObjectsError[source]#

Bases: AiidaException

Raised when more than one entity is found in the DB, but only one was expected.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.NotExistent[source]#

Bases: AiidaException

Raised when the required entity does not exist.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.NotExistentAttributeError[source]#

Bases: AttributeError, NotExistent

Raised when the required entity does not exist, when fetched as an attribute.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.exceptions.NotExistentKeyError[source]#

Bases: KeyError, NotExistent

Raised when the required entity does not exist, when fetched as a dictionary key.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.exceptions.OutputParsingError[source]#

Bases: ParsingError

Can be raised by a Parser when it fails to parse the output generated by a CalcJob process.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ParsingError[source]#

Bases: AiidaException

Generic error raised when there is a parsing error

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.PluginInternalError[source]#

Bases: InternalError

Error raised when there is an internal error which is due to a plugin and not to the AiiDA infrastructure.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.ProfileConfigurationError[source]#

Bases: ConfigurationError

Configuration error raised when a wrong or non-existent profile is requested.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.RemoteOperationError[source]#

Bases: AiidaException

Raised when an error in a remote operation occurs, as in a failed kill() of a scheduler job.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.StorageBackupError[source]#

Bases: AiidaException

Raised if a critical error is encountered during a storage backup.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.StorageMigrationError[source]#

Bases: DatabaseMigrationError

Raised if a critical error is encountered during a storage migration.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.StoringNotAllowed[source]#

Bases: AiidaException

Raised when the user tries to store an unstorable node (e.g. a base Node class)

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.TestsNotAllowedError[source]#

Bases: AiidaException

Raised when tests are required to be run/loaded, but we are not in a testing environment.

This is to prevent data loss.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.TransportTaskException[source]#

Bases: AiidaException

Raised when a TransportTask, a task to be completed by the engine that requires transport, fails

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.UniquenessError[source]#

Bases: AiidaException

Raised when the user tries to violate a uniqueness constraint (on the DB, for instance).

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.UnreachableStorage[source]#

Bases: ConfigurationError

Raised when a connection to the storage backend fails.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#
exception aiida.common.exceptions.UnsupportedSpeciesError[source]#

Bases: ValueError

Raised when StructureData operations are fed species that are not supported by AiiDA, such as deuterium

__module__ = 'aiida.common.exceptions'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.exceptions.ValidationError[source]#

Bases: AiidaException

Error raised when there is an error during the validation phase of a property.

__annotations__ = {}#
__module__ = 'aiida.common.exceptions'#

Various dictionary types with extended functionality.

class aiida.common.extendeddicts.AttributeDict(dictionary=None)[source]#

Bases: dict

This class internally stores values in a dictionary, but exposes the keys also as attributes, i.e. asking for attrdict.key will return the value of attrdict[‘key’] and so on.

Accessing a key that does not exist as an attribute raises an AttributeError, while the usual KeyError is raised when the dictionary syntax is used.
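The behavior can be sketched with a minimal subclass. This is illustrative only; the real class additionally handles deep copying, pickling and dir():

```python
class AttributeDict(dict):
    """Minimal sketch: expose dictionary keys as attributes."""

    def __getattr__(self, attr):
        try:
            return self[attr]
        except KeyError:
            # Attribute access must raise AttributeError, not KeyError.
            raise AttributeError(attr) from None

    def __setattr__(self, attr, value):
        self[attr] = value

d = AttributeDict({'key': 1})
print(d.key, d['key'])
```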

__annotations__ = {}#
__deepcopy__(memo=None)[source]#

Deep copy.

__delattr__(attr)[source]#

Delete a key as an attribute.

Raises:

AttributeError – if the attribute does not correspond to an existing key.

__dict__ = mappingproxy({'__module__': 'aiida.common.extendeddicts', '__doc__': "This class internally stores values in a dictionary, but exposes\n    the keys also as attributes, i.e. asking for attrdict.key\n    will return the value of attrdict['key'] and so on.\n\n    Raises an AttributeError if the key does not exist, when called as an attribute,\n    while the usual KeyError if the key does not exist and the dictionary syntax is\n    used.\n    ", '__init__': <function AttributeDict.__init__>, '__repr__': <function AttributeDict.__repr__>, '__getattr__': <function AttributeDict.__getattr__>, '__setattr__': <function AttributeDict.__setattr__>, '__delattr__': <function AttributeDict.__delattr__>, '__deepcopy__': <function AttributeDict.__deepcopy__>, '__getstate__': <function AttributeDict.__getstate__>, '__setstate__': <function AttributeDict.__setstate__>, '__dir__': <function AttributeDict.__dir__>, '__dict__': <attribute '__dict__' of 'AttributeDict' objects>, '__weakref__': <attribute '__weakref__' of 'AttributeDict' objects>, '__annotations__': {}})#
__dir__()[source]#

Default dir() implementation.

__getattr__(attr)[source]#

Read a key as an attribute.

Raises:

AttributeError – if the attribute does not correspond to an existing key.

__getstate__()[source]#

Needed for pickling this class.

__init__(dictionary=None)[source]#

Recursively turn the dict and all its nested dictionaries into AttributeDict instances.

__module__ = 'aiida.common.extendeddicts'#
__repr__()[source]#

Representation of the object.

__setattr__(attr, value)[source]#

Set a key as an attribute.

__setstate__(dictionary)[source]#

Needed for pickling this class.

__weakref__#

list of weak references to the object (if defined)

class aiida.common.extendeddicts.DefaultFieldsAttributeDict(dictionary=None)[source]#

Bases: AttributeDict

A dictionary with access to the keys as attributes, and with an internal value storing the ‘default’ keys to be distinguished from extra fields.

Extra methods defaultkeys() and extrakeys() divide the set returned by keys() into default keys (i.e. those defined at definition time) and other keys. There is also a method get_default_fields() to return the internal list.

Moreover, for undefined default keys, it returns None instead of raising a KeyError/AttributeError exception.

Remember to define the _default_fields in a subclass! E.g.:

class TestExample(DefaultFieldsAttributeDict):
    _default_fields = ('a','b','c')

When the validate() method is called, it calls in turn all validate_KEY methods, where KEY is one of the default keys. If the method is not present, the field is considered to be always valid. Each validate_KEY method should accept a single argument ‘value’ that will contain the value to be checked.

It raises a ValidationError if any of the validate_KEY functions raises an exception, otherwise it simply returns. NOTE: the validate_* functions are called also for unset fields, so if the field can be empty on validation, you have to start your validation function with something similar to:

if value is None:
    return
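A sketch of the dispatch described above. The DefaultFieldsSketch class is illustrative; the real class wraps any failure in a ValidationError rather than letting the original exception propagate:

```python
class DefaultFieldsSketch(dict):
    _default_fields = ()

    def validate(self):
        # Call validate_<KEY> for each default field; a missing validator
        # means the field is always valid. Unset fields are passed as None.
        for key in self._default_fields:
            validator = getattr(self, f'validate_{key}', None)
            if validator is not None:
                validator(self.get(key))

class TestExample(DefaultFieldsSketch):
    _default_fields = ('a', 'b', 'c')

    def validate_a(self, value):
        if value is None:
            return  # the field may be unset at validation time
        if not isinstance(value, int):
            raise ValueError("'a' must be an integer")

TestExample({'a': 1}).validate()  # passes silently
```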
__annotations__ = {}#
__getitem__(key)[source]#

Return None instead of raising an exception if the key does not exist but is in the list of default fields.

__module__ = 'aiida.common.extendeddicts'#
__setattr__(attr, value)[source]#

Overridden to allow direct (non-dictionary) access to fields whose name starts with an underscore.

_default_fields = ()#
defaultkeys()[source]#

Return the default keys defined in the instance.

extrakeys()[source]#

Return the extra keys defined in the instance.

classmethod get_default_fields()[source]#

Return the list of default fields, whether or not they are set in the instance.

validate()[source]#

Validate the keys, if any validate_* method is available.

class aiida.common.extendeddicts.FixedFieldsAttributeDict(init=None)[source]#

Bases: AttributeDict

A dictionary with access to the keys as attributes, and with filtering of valid attributes. This is only the base class, without valid attributes; use a derived class to do the actual work. E.g.:

class TestExample(FixedFieldsAttributeDict):
    _valid_fields = ('a','b','c')
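The filtering can be sketched as follows. The FixedFieldsSketch class is illustrative; the real class also rejects invalid attribute assignment and validates the constructor argument:

```python
class FixedFieldsSketch(dict):
    _valid_fields = ()

    def __setitem__(self, item, value):
        # Only keys declared in _valid_fields may be set.
        if item not in self._valid_fields:
            raise ValueError(f'{item!r} is not a valid field')
        super().__setitem__(item, value)

class TestExample(FixedFieldsSketch):
    _valid_fields = ('a', 'b', 'c')

example = TestExample()
example['a'] = 1  # accepted; example['z'] = 1 would raise ValueError
```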
__annotations__ = {}#
__dir__()[source]#

Default dir() implementation.

__init__(init=None)[source]#

Recursively turn the dict and all its nested dictionaries into AttributeDict instances.

__module__ = 'aiida.common.extendeddicts'#
__setattr__(attr, value)[source]#

Overridden to allow direct (non-dictionary) access to fields whose name starts with an underscore.

__setitem__(item, value)[source]#

Set a key as an attribute.

_valid_fields = ()#
classmethod get_valid_fields()[source]#

Return the list of valid fields.

Utility functions to operate on filesystem files.

aiida.common.files.md5_file(filepath, block_size_factor=128)[source]#

Create the hexdigested md5 checksum of the contents of the file at filepath.

Parameters:
  • filepath – the filepath of the file for which we want the md5sum

  • block_size_factor – the file is read at chunks of size block_size_factor * md5.block_size, where md5.block_size is the block_size used internally by the hashlib module.

Returns:

a string with the hexdigest md5.

Raises:

No checks are done on the file, so if it doesn’t exist it may raise OSError.

aiida.common.files.md5_from_filelike(filelike, block_size_factor=128)[source]#

Create the hexdigested md5 checksum of the contents from a filelike object.

Parameters:
  • filelike – the filelike object for whose contents to generate the md5 checksum

  • block_size_factor – the file is read at chunks of size block_size_factor * md5.block_size, where md5.block_size is the block_size used internally by the hashlib module.

Returns:

a string with the hexdigest md5.

Raises:

no checks are done on the filelike object, so it may raise OSError if it cannot be read from.
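The chunked reading governed by block_size_factor can be sketched with the standard library alone. This is an illustrative equivalent, not the AiiDA function itself:

```python
import hashlib
import io

def md5_from_filelike(filelike, block_size_factor=128):
    md5 = hashlib.md5()
    # Read block_size_factor * md5.block_size bytes at a time so that
    # arbitrarily large files never have to fit in memory at once.
    chunk_size = block_size_factor * md5.block_size
    for chunk in iter(lambda: filelike.read(chunk_size), b''):
        md5.update(chunk)
    return md5.hexdigest()

print(md5_from_filelike(io.BytesIO(b'some file contents')))
```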

aiida.common.files.sha1_file(filename, block_size_factor=128)[source]#

Open a file and return its sha1sum (hexdigested).

Parameters:
  • filename – the filename of the file for which we want the sha1sum

  • block_size_factor – the file is read at chunks of size block_size_factor * sha1.block_size, where sha1.block_size is the block_size used internally by the hashlib module.

Returns:

a string with the hexdigest sha1.

Raises:

No checks are done on the file, so if it doesn’t exist it may raise OSError.

Utility functions to operate on filesystem folders.

class aiida.common.folders.Folder(abspath, folder_limit=None)[source]#

Bases: object

A class to manage generic folders, preventing access outside the given folder boundaries.

__dict__ = mappingproxy({'__module__': 'aiida.common.folders', '__doc__': 'A class to manage generic folders, avoiding to get out of\n    specific given folder borders.\n\n    .. todo::\n        fix this, os.path.commonprefix of /a/b/c and /a/b2/c will give\n        a/b, check if this is wanted or if we want to put trailing slashes.\n        (or if we want to use os.path.relpath and check for a string starting\n        with os.pardir?)\n\n    .. todo::\n        rethink whether the folder_limit option is still useful. If not, remove\n        it alltogether (it was a nice feature, but unfortunately all the calls\n        to os.path.abspath or normpath are quite slow).\n    ', '__init__': <function Folder.__init__>, 'mode_dir': <property object>, 'mode_file': <property object>, 'get_subfolder': <function Folder.get_subfolder>, 'get_content_list': <function Folder.get_content_list>, 'create_symlink': <function Folder.create_symlink>, 'insert_path': <function Folder.insert_path>, 'create_file_from_filelike': <function Folder.create_file_from_filelike>, 'remove_path': <function Folder.remove_path>, 'get_abs_path': <function Folder.get_abs_path>, 'open': <function Folder.open>, 'abspath': <property object>, 'folder_limit': <property object>, 'exists': <function Folder.exists>, 'isfile': <function Folder.isfile>, 'isdir': <function Folder.isdir>, 'erase': <function Folder.erase>, 'create': <function Folder.create>, 'replace_with_folder': <function Folder.replace_with_folder>, '__dict__': <attribute '__dict__' of 'Folder' objects>, '__weakref__': <attribute '__weakref__' of 'Folder' objects>, '__annotations__': {}})#
__init__(abspath, folder_limit=None)[source]#

Construct a new instance.

__module__ = 'aiida.common.folders'#
__weakref__#

list of weak references to the object (if defined)

property abspath#

The absolute path of the folder.

create()[source]#

Creates the folder, if it does not exist on the disk yet.

It will also create top directories, if absent.

It is always safe to call it, it will do nothing if the folder already exists.

create_file_from_filelike(filelike, filename, mode='wb', encoding=None)[source]#

Create a file with the given filename from a filelike object.

Parameters:
  • filelike – a filelike object whose contents to copy

  • filename – the filename for the file that is to be created

  • mode – the mode with which the target file will be written

  • encoding – the encoding with which the target file will be written

Returns:

the absolute filepath of the created file
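The mechanics of create_file_from_filelike can be sketched with the standard library alone; the helper below is a hypothetical stand-in, not the aiida implementation:

```python
import io
import os
import shutil
import tempfile

def create_file_from_filelike(filelike, dirpath, filename, mode='wb'):
    """Copy the contents of a filelike object into a new file inside dirpath."""
    filepath = os.path.join(dirpath, filename)
    with open(filepath, mode) as handle:
        shutil.copyfileobj(filelike, handle)
    return os.path.abspath(filepath)

with tempfile.TemporaryDirectory() as tmp:
    path = create_file_from_filelike(io.BytesIO(b'content'), tmp, 'data.txt')
    with open(path, 'rb') as handle:
        assert handle.read() == b'content'
```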

create_symlink(src, name)[source]#

Create a symlink inside the folder to the location ‘src’.

Parameters:
  • src – the location to which the symlink must point. Can be either a relative or an absolute path. Should, however, be relative to work properly also when the repository is moved!

  • name – the filename of the symlink to be created.

erase(create_empty_folder=False)[source]#

Erases the folder. Should be called only in very specific cases; in general, folders should not be erased!

Doesn’t complain if the folder does not exist.

Parameters:

create_empty_folder – if True, after erasing, creates an empty dir.

exists()[source]#

Return True if the folder exists, False otherwise.

property folder_limit#

The folder limit that cannot be crossed when creating files and folders.

get_abs_path(relpath, check_existence=False)[source]#

Return an absolute path for a file or folder in this folder.

The advantage of using this method is that it checks that relpath is a valid path within this folder, and not something e.g. containing slashes that would escape it.

Parameters:
  • relpath – The relative path of the file or directory.

  • check_existence – if False, just return the file path. Otherwise, also check if the file or directory actually exists. Raise OSError if it does not.

get_content_list(pattern='*', only_paths=True)[source]#

Return a list of files (and subfolders) in the folder, matching a given pattern.

Example: If you want to exclude files starting with a dot, you can call this method with pattern='[!.]*'

Parameters:
  • pattern – a pattern for the file/folder names, using Unix filename pattern matching (see Python standard module fnmatch). By default, pattern is ‘*’, matching all files and folders.

  • only_paths – if True (default), return only a flat list of names. If False, return pairs (name, is_file).

Returns:

if only_paths is False, a list of two-element tuples, where the first element is the file name and the second is True if the element is a file, False if it is a directory; otherwise, a flat list of names.
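The pattern argument uses Unix filename matching from the standard fnmatch module; the dotfile-exclusion example above behaves as follows:

```python
from fnmatch import fnmatch

names = ['data.txt', '.hidden', 'results.json', '.aiida']

# the default pattern '*' matches everything, including dotfiles
assert [n for n in names if fnmatch(n, '*')] == names

# the character class [!.] excludes names starting with a dot
assert [n for n in names if fnmatch(n, '[!.]*')] == ['data.txt', 'results.json']
```

Note that, unlike shell globbing, fnmatch makes no special case for leading dots, which is why '[!.]*' is needed to exclude them explicitly.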

get_subfolder(subfolder, create=False, reset_limit=False)[source]#

Return a Folder object pointing to a subfolder.

Parameters:
  • subfolder – a string with the relative path of the subfolder, relative to the absolute path of this object. Note that this may also contain ‘..’ parts, as far as this does not go beyond the folder_limit.

  • create – if True, the new subfolder is created, if it does not exist.

  • reset_limit – when doing b = a.get_subfolder('xxx', reset_limit=False), the limit of b will be the same limit of a. if True, the limit will be set to the boundaries of folder b.

Returns:

a Folder object pointing to the subfolder.

insert_path(src, dest_name=None, overwrite=True)[source]#

Copy a file to the folder.

Parameters:
  • src – the source filename to copy

  • dest_name – if None, the same basename of src is used. Otherwise, the destination filename will have this file name.

  • overwrite – if False, raises an error on existing destination; otherwise, delete it first.

isdir(relpath)[source]#

Return True if ‘relpath’ exists inside the folder and is a directory, False otherwise.

isfile(relpath)[source]#

Return True if ‘relpath’ exists inside the folder and is a file, False otherwise.

property mode_dir#

Return the mode with which the folders should be created

property mode_file#

Return the mode with which the files should be created

open(name, mode='r', encoding='utf8', check_existence=False)[source]#

Open a file in the current folder and return the corresponding file object.

Parameters:

check_existence – if False, just return the file path. Otherwise, also check if the file or directory actually exists. Raise OSError if it does not.

remove_path(filename)[source]#

Remove a file or folder from the folder.

Parameters:

filename – the relative path name to remove

replace_with_folder(srcdir, move=False, overwrite=False)[source]#

This routine copies or moves the source folder ‘srcdir’ to the local folder pointed to by this Folder.

Parameters:
  • srcdir (str) – the source folder on the disk; this must be an absolute path

  • move (bool) – if True, the srcdir is moved to the repository. Otherwise, it is only copied.

  • overwrite (bool) – if True, the folder will be erased first. if False, an OSError is raised if the folder already exists. Whatever the value of this flag, parent directories will be created, if needed.

Raises:
  • OSError – in case of problems accessing or writing the files (possibly raised from the shutil module).

  • ValueError – if srcdir is not an absolute path.

class aiida.common.folders.SandboxFolder(filepath: Path | None = None)[source]#

Bases: Folder

A class to manage the creation and management of a sandbox folder.

Note

This class should be used with a context manager to guarantee automatic cleanup:

with SandboxFolder() as folder:

# Do something with folder

__annotations__ = {}#
__enter__()[source]#

Enter a context and return self.

__exit__(exc_type, exc_value, traceback)[source]#

Erase the temporary directory created in the constructor.

__init__(filepath: Path | None = None)[source]#

Initialize a Folder object for an automatically created temporary directory.

Parameters:

filepath – A filepath to a directory to use for the sandbox folder. This path will be actually used as the base path and a random subfolder will be generated inside it. This will guarantee that multiple instances of the class can be created with the same value for filepath while guaranteeing they are independent.
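The guarantee described for filepath — a random subfolder under a shared base path, so that instances never collide — can be sketched with the standard library alone (a hypothetical stand-in, not the aiida implementation):

```python
import os
import tempfile

# A common base path is shared by multiple "sandboxes"; each instance
# creates its own randomly named subfolder, so they stay independent.
base = tempfile.mkdtemp()
sandbox_a = tempfile.mkdtemp(dir=base)
sandbox_b = tempfile.mkdtemp(dir=base)

assert sandbox_a != sandbox_b
assert os.path.dirname(sandbox_a) == os.path.dirname(sandbox_b) == base
```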

__module__ = 'aiida.common.folders'#
class aiida.common.folders.SubmitTestFolder(basepath='submit_test')[source]#

Bases: Folder

Sandbox folder that can be used for the test submission of CalcJobs.

The directory will be created in the current working directory with a configurable basename. Then a sub folder will be created within this base folder based on the current date and an index in order to not overwrite already existing created test folders.

__annotations__ = {}#
__enter__()[source]#

Return the sub folder to be used; called when entering the with statement.

__exit__(exc_type, exc_value, traceback)[source]#

When context manager is exited, do not delete the folder.

__init__(basepath='submit_test')[source]#

Construct and create the sandbox folder.

The directory will be created in the current working directory with the name given by basepath. Then a sub folder will be created within this base folder based on the current date and an index in order to not overwrite already existing created test folders.

Parameters:

basepath – name of the base directory that will be created in the current working directory

__module__ = 'aiida.common.folders'#
_sub_folder = None#

Common password and hash generation functions.

aiida.common.hashing._(folder: Folder, **kwargs) list[bytes][source]#

Hash the content of a Folder object. The name of the folder itself is ignored.

Parameters:

ignored_folder_content – list of filenames to be ignored for the hashing

aiida.common.hashing._make_hash(object_to_hash: Any, **_) list[bytes][source]#
aiida.common.hashing._make_hash(bytes_obj: bytes, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: str, **kwargs) list[bytes]
aiida.common.hashing._make_hash(sequence_obj: Sequence, **kwargs) list[bytes]
aiida.common.hashing._make_hash(set_obj: Set, **kwargs) list[bytes]
aiida.common.hashing._make_hash(mapping: Mapping, **kwargs) list[bytes]
aiida.common.hashing._make_hash(mapping: OrderedDict, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: Real, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: Decimal, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: Complex, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: Integral, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: bool, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: type[None], **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: datetime, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: date, **kwargs) list[bytes]
aiida.common.hashing._make_hash(val: UUID, **kwargs) list[bytes]
aiida.common.hashing._make_hash(datetime_precision: DatetimePrecision, **kwargs) list[bytes]
aiida.common.hashing._make_hash(folder: Folder, **kwargs) list[bytes]

Implementation of the make_hash function. The hash is created as a 28 byte integer, and only later converted to a string.

aiida.common.hashing._single_digest(obj_type: str, obj_bytes: bytes = b'') bytes[source]#
aiida.common.hashing.chunked_file_hash(handle: BinaryIO, hash_cls: Any, chunksize: int = 524288, **kwargs: Any) str[source]#

Return the hash for the given file handle

Will read the file in chunks, which should be opened in ‘rb’ mode.

Parameters:
  • handle – a file handle, opened in ‘rb’ mode.

  • hash_cls – a class implementing hashlib._Hash

  • chunksize – number of bytes to chunk the file read in

  • kwargs – arguments to pass to the hasher initialisation

Returns:

the hash hexdigest (the hash key)
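The chunked-reading strategy keeps memory usage constant regardless of file size. A minimal re-implementation sketch of the same idea (not the aiida source):

```python
import hashlib
import io

def chunked_file_hash(handle, hash_cls, chunksize=524288):
    """Read a binary file handle in chunks and return the hexdigest."""
    hasher = hash_cls()
    while True:
        chunk = handle.read(chunksize)
        if not chunk:  # empty bytes object signals end of file
            break
        hasher.update(chunk)
    return hasher.hexdigest()

digest = chunked_file_hash(io.BytesIO(b'some file content'), hashlib.sha256)
assert digest == hashlib.sha256(b'some file content').hexdigest()
```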

aiida.common.hashing.float_to_text(value: SupportsFloat, sig: int) str[source]#

Convert a float to a text string for computing the hash. Preserve up to N significant figures, as given by sig.

Parameters:
  • value – the float value to convert

  • sig – choose how many digits after the comma should be output
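Truncating to a fixed number of significant figures before hashing makes the hash robust against insignificant floating-point noise. A minimal sketch of such a conversion (the actual aiida implementation may differ):

```python
def float_to_text(value, sig):
    """Render a float with `sig` significant digits, for stable hashing."""
    return f'{value:.{sig}g}'

assert float_to_text(3.141592653589793, 5) == '3.1416'
assert float_to_text(0.000123456, 3) == '0.000123'
```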

aiida.common.hashing.get_random_string(length: int = 12) str[source]#

Return a securely generated random string.

The default length of 12 with the all ASCII letters and digits returns a 71-bit value:

log_2((26+26+10)^12) =~ 71 bits

Parameters:

length – The number of characters to use for the string.
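The 71-bit figure follows from a 62-symbol alphabet (26 lowercase + 26 uppercase + 10 digits) raised to the string length. A sketch using the standard secrets module for cryptographically secure choices (a stand-in, not the aiida source):

```python
import math
import secrets
import string

def get_random_string(length=12):
    """Return a securely generated random string of letters and digits."""
    alphabet = string.ascii_letters + string.digits  # 26 + 26 + 10 = 62 symbols
    return ''.join(secrets.choice(alphabet) for _ in range(length))

token = get_random_string()
assert len(token) == 12

# entropy: log2(62^12) = 12 * log2(62) ~= 71.45 bits
assert math.isclose(12 * math.log2(62), 71.45, abs_tol=0.01)
```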

aiida.common.hashing.make_hash(object_to_hash: Any, **kwargs) str[source]#

Make a hash from a dictionary, list, tuple or set, nested to any level, that contains only other hashable or non-hashable types (including lists, tuples, sets and dictionaries).

Parameters:

object_to_hash – the object to hash

Returns:

a unique hash

There are a lot of modules providing functionalities to create unique hashes for hashable values. However, getting hashes for nonhashable items like sets or dictionaries is not easily doable because order is not fixed. This leads to the peril of getting different hashes for the same dictionary.

This function avoids this by recursing through nonhashable items and hashing iteratively. Uses python’s sorted function to sort unsorted sets and dictionaries by sorting the hashed keys.
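The recurse-and-sort idea can be illustrated with a simplified sketch. This is a conceptual stand-in only: the real make_hash uses typed digests and a different hash function, but the order-independence property is the same.

```python
import hashlib

def make_hash(obj):
    """Order-independent hash for nested, possibly unhashable containers."""
    if isinstance(obj, dict):
        # sort items by the hashes of keys, so insertion order is irrelevant
        items = sorted((make_hash(k), make_hash(v)) for k, v in obj.items())
        payload = 'dict:' + repr(items)
    elif isinstance(obj, (set, frozenset)):
        payload = 'set:' + repr(sorted(make_hash(e) for e in obj))
    elif isinstance(obj, (list, tuple)):
        # sequences are order-sensitive, so no sorting here
        payload = 'seq:' + repr([make_hash(e) for e in obj])
    else:
        payload = f'{type(obj).__name__}:{obj!r}'
    return hashlib.sha1(payload.encode()).hexdigest()

# equal dictionaries hash equally regardless of insertion order
assert make_hash({'a': 1, 'b': [2, 3]}) == make_hash({'b': [2, 3], 'a': 1})
assert make_hash({1, 2, 3}) == make_hash({3, 2, 1})
```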

Abstracts JSON usage to ensure compatibility with Python2 and Python3.

Use this module preferentially over standard json to ensure compatibility.

Deprecated since version 2.0.0: This module should no longer be used. Python 2 support has long since been dropped and for Python 3, one should simply use the json module of the standard library directly.

aiida.common.json.dump(data, handle, **kwargs)[source]#

Serialize data as a JSON formatted stream to handle.

We use ensure_ascii=False to write unicode characters specifically as this improves the readability of the json and reduces the file size.

aiida.common.json.dumps(data, **kwargs)[source]#

Serialize data as a JSON formatted string.

We use ensure_ascii=False to write unicode characters specifically as this improves the readability of the json and reduces the file size.

aiida.common.json.load(handle, **kwargs)[source]#

Deserialize handle text or binary file containing a JSON document to a Python object.

Raises:

ValueError – if no valid JSON object could be decoded.

aiida.common.json.loads(string, **kwargs)[source]#

Deserialize text or binary string containing a JSON document to a Python object.

Raises:

ValueError – if no valid JSON object could be decoded.

Utilities that extend the basic python language.

class aiida.common.lang.classproperty(getter: Callable[[SelfType], ReturnType])[source]#

Bases: Generic[ReturnType]

A class that, when used as a decorator, works as if the two decorators @property and @classmethod were applied together (i.e., the object works as a property, both for the class and for any of its instances; and is called with the class cls rather than with the instance as its first argument).

__annotations__ = {}#
__dict__ = mappingproxy({'__module__': 'aiida.common.lang', '__doc__': 'A class that, when used as a decorator, works as if the\n    two decorators @property and @classmethod where applied together\n    (i.e., the object works as a property, both for the Class and for any\n    of its instance; and is called with the class cls rather than with the\n    instance as its first argument).\n    ', '__init__': <function classproperty.__init__>, '__get__': <function classproperty.__get__>, '__orig_bases__': (typing.Generic[~ReturnType],), '__dict__': <attribute '__dict__' of 'classproperty' objects>, '__weakref__': <attribute '__weakref__' of 'classproperty' objects>, '__parameters__': (~ReturnType,), '__annotations__': {}})#
__get__(instance: Any, owner: SelfType) ReturnType[source]#
__init__(getter: Callable[[SelfType], ReturnType]) None[source]#
__module__ = 'aiida.common.lang'#
__orig_bases__ = (typing.Generic[~ReturnType],)#
__parameters__ = (~ReturnType,)#
__weakref__#

list of weak references to the object (if defined)
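The descriptor behaviour described above can be sketched in a few lines; this is a minimal stand-in for illustration, not the aiida implementation:

```python
class classproperty:
    """Like @property, but the getter receives the class, not an instance."""

    def __init__(self, getter):
        self.getter = getter

    def __get__(self, instance, owner):
        # `owner` is always the class, whether accessed on it or an instance
        return self.getter(owner)

class Circle:
    _pi = 3.14159

    @classproperty
    def pi(cls):
        return cls._pi

# works on the class and on instances alike
assert Circle.pi == 3.14159
assert Circle().pi == 3.14159
```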

aiida.common.lang.isidentifier(identifier)[source]#

Return whether the given string is a valid python identifier.

Returns:

boolean, True if identifier is valid, False otherwise

Raises:

TypeError – if identifier is not string type

aiida.common.lang.override(func: MethodType) MethodType#
aiida.common.lang.override_decorator(check=False) Callable[[MethodType], MethodType][source]#

Decorator to signal that a method from a base class is being overridden completely.

aiida.common.lang.type_check(what, of_type, msg=None, allow_none=False)[source]#

Verify that object ‘what’ is of type ‘of_type’ and if not the case, raise a TypeError.

Parameters:
  • what – the object to check

  • of_type – the type (or tuple of types) to compare to

  • msg – if specified, allows to customize the message that is passed within the TypeError exception

  • allow_none – boolean, if True will not raise if the passed what is None

Returns:

what or None

class aiida.common.links.GraphTraversalRule(link_type, direction, toggleable, default)#

Bases: tuple

A namedtuple that defines a graph traversal rule.

When starting from a certain subset of nodes, the graph traversal rules specify which links should be followed to add adjacent nodes, to finally arrive at a set of nodes that represent a valid and consistent sub graph.

Parameters:
  • link_type – the LinkType that the rule applies to

  • direction – whether the link type should be followed backwards or forwards

  • toggleable – boolean to indicate whether the rule can be changed from the default value. If this is False it means the default value can never be changed as it will result in an inconsistent graph.

  • default – boolean, the default value of the rule, if True means that the link type for the given direction should be followed.

__getnewargs__()#

Return self as a plain tuple. Used by copy and pickle.

__match_args__ = ('link_type', 'direction', 'toggleable', 'default')#
__module__ = 'aiida.common.links'#
static __new__(_cls, link_type, direction, toggleable, default)#

Create new instance of GraphTraversalRule(link_type, direction, toggleable, default)

__repr__()#

Return a nicely formatted representation string

__slots__ = ()#
_asdict()#

Return a new dict which maps field names to their values.

_field_defaults = {}#
_fields = ('link_type', 'direction', 'toggleable', 'default')#
classmethod _make(iterable)#

Make a new GraphTraversalRule object from a sequence or iterable

_replace(**kwds)#

Return a new GraphTraversalRule object replacing specified fields with new values

default#

Alias for field number 3

direction#

Alias for field number 1

link_type#

Alias for field number 0

toggleable#

Alias for field number 2
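Since GraphTraversalRule is a plain namedtuple, a rule can be constructed and inspected as follows. The link_type is shown here as a plain string stand-in; the real rules use the LinkType enum:

```python
from collections import namedtuple

GraphTraversalRule = namedtuple(
    'GraphTraversalRule', ('link_type', 'direction', 'toggleable', 'default')
)

# e.g. a rule stating that CREATE links may be followed backwards by default
rule = GraphTraversalRule('create', 'backward', True, False)

assert rule.direction == 'backward'
assert rule._asdict()['toggleable'] is True
```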

class aiida.common.links.GraphTraversalRules(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: Enum

Graph traversal rules when deleting or exporting nodes.

DEFAULT = {'call_calc_backward': (LinkType.CALL_CALC, 'backward', True, False), 'call_calc_forward': (LinkType.CALL_CALC, 'forward', True, False), 'call_work_backward': (LinkType.CALL_WORK, 'backward', True, False), 'call_work_forward': (LinkType.CALL_WORK, 'forward', True, False), 'create_backward': (LinkType.CREATE, 'backward', True, False), 'create_forward': (LinkType.CREATE, 'forward', True, False), 'input_calc_backward': (LinkType.INPUT_CALC, 'backward', True, False), 'input_calc_forward': (LinkType.INPUT_CALC, 'forward', True, False), 'input_work_backward': (LinkType.INPUT_WORK, 'backward', True, False), 'input_work_forward': (LinkType.INPUT_WORK, 'forward', True, False), 'return_backward': (LinkType.RETURN, 'backward', True, False), 'return_forward': (LinkType.RETURN, 'forward', True, False)}#
DELETE = {'call_calc_backward': (LinkType.CALL_CALC, 'backward', False, True), 'call_calc_forward': (LinkType.CALL_CALC, 'forward', True, True), 'call_work_backward': (LinkType.CALL_WORK, 'backward', False, True), 'call_work_forward': (LinkType.CALL_WORK, 'forward', True, True), 'create_backward': (LinkType.CREATE, 'backward', False, True), 'create_forward': (LinkType.CREATE, 'forward', True, True), 'input_calc_backward': (LinkType.INPUT_CALC, 'backward', False, False), 'input_calc_forward': (LinkType.INPUT_CALC, 'forward', False, True), 'input_work_backward': (LinkType.INPUT_WORK, 'backward', False, False), 'input_work_forward': (LinkType.INPUT_WORK, 'forward', False, True), 'return_backward': (LinkType.RETURN, 'backward', False, True), 'return_forward': (LinkType.RETURN, 'forward', False, False)}#
EXPORT = {'call_calc_backward': (LinkType.CALL_CALC, 'backward', True, True), 'call_calc_forward': (LinkType.CALL_CALC, 'forward', False, True), 'call_work_backward': (LinkType.CALL_WORK, 'backward', True, True), 'call_work_forward': (LinkType.CALL_WORK, 'forward', False, True), 'create_backward': (LinkType.CREATE, 'backward', True, True), 'create_forward': (LinkType.CREATE, 'forward', False, True), 'input_calc_backward': (LinkType.INPUT_CALC, 'backward', False, True), 'input_calc_forward': (LinkType.INPUT_CALC, 'forward', True, False), 'input_work_backward': (LinkType.INPUT_WORK, 'backward', False, True), 'input_work_forward': (LinkType.INPUT_WORK, 'forward', True, False), 'return_backward': (LinkType.RETURN, 'backward', True, False), 'return_forward': (LinkType.RETURN, 'forward', False, True)}#
__module__ = 'aiida.common.links'#
class aiida.common.links.LinkType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: Enum

A simple enum of allowed link types.

CALL_CALC = 'call_calc'#
CALL_WORK = 'call_work'#
CREATE = 'create'#
INPUT_CALC = 'input_calc'#
INPUT_WORK = 'input_work'#
RETURN = 'return'#
__module__ = 'aiida.common.links'#

Validate the given link label.

Valid link labels adhere to the following restrictions:

  • Has to be a valid python identifier

  • Can only contain alphanumeric characters and underscores

  • Can not start or end with an underscore

Raises:
  • TypeError – if the link label is not a string type

  • ValueError – if the link label is invalid

Module for all logging methods/classes that don’t need the ORM.

class aiida.common.log.AiidaLoggerType(name, level=0)[source]#

Bases: Logger

__annotations__ = {}#
__module__ = 'aiida.common.log'#
report(msg: str, *args, **kwargs) None[source]#

Log a message at the REPORT level.

aiida.common.log.CLI_ACTIVE: bool | None = None#

Flag that is set to True if the module is imported while verdi is being called.

aiida.common.log.CLI_LOG_LEVEL: str | None = None#

Set if verdi is called with the --verbosity flag, in which case it holds the corresponding log level.

class aiida.common.log.LogLevels(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)#

Bases: Enum

CRITICAL = 'CRITICAL'#
DEBUG = 'DEBUG'#
ERROR = 'ERROR'#
INFO = 'INFO'#
NOTSET = 'NOTSET'#
REPORT = 'REPORT'#
WARNING = 'WARNING'#
__module__ = 'aiida.common.log'#
aiida.common.log.capture_logging(logger: ~logging.Logger = <Logger aiida (WARNING)>) Generator[StringIO, None, None][source]#

Capture logging to a stream in memory.

Note, this only copies any content that is being logged to a stream in memory. It does not interfere with any other existing stream handlers. In this sense, this context manager is non-destructive.

Parameters:

logger – The logger whose output to capture.

Returns:

A stream to which the logged content is captured.
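The non-destructive capture can be sketched by attaching an extra in-memory StreamHandler, leaving any existing handlers untouched. The real capture_logging is a context manager; this hypothetical helper shows only the core mechanism:

```python
import io
import logging

def capture_logging(logger):
    """Attach an in-memory stream handler; existing handlers are untouched."""
    stream = io.StringIO()
    handler = logging.StreamHandler(stream)
    logger.addHandler(handler)
    return stream, handler

logger = logging.getLogger('demo')
logger.setLevel(logging.INFO)
logger.propagate = False  # keep the demo output out of the root logger

stream, handler = capture_logging(logger)
logger.info('something happened')
logger.removeHandler(handler)

assert 'something happened' in stream.getvalue()
```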

aiida.common.log.configure_logging(with_orm=False, daemon=False, daemon_log_file=None)[source]#

Setup the logging by retrieving the LOGGING dictionary from aiida and passing it to the python module logging.config.dictConfig. If the logging needs to be setup for the daemon, set the argument ‘daemon’ to True and specify the path to the log file. This will cause a ‘daemon_handler’ to be added to all the configured loggers, that is a RotatingFileHandler that writes to the log file.

Parameters:
  • with_orm – configure logging to the backend storage. We don’t configure this by default, since it would load the modules that slow the CLI

  • daemon – configure the logging for a daemon task by adding a file handler instead of the default ‘console’ StreamHandler

  • daemon_log_file – absolute filepath of the log file for the RotatingFileHandler

aiida.common.log.evaluate_logging_configuration(dictionary)[source]#

Recursively evaluate the logging configuration, calling lambdas when encountered.

This allows the configuration options that are dependent on the active profile to be loaded lazily.

Returns:

evaluated logging configuration dictionary

aiida.common.log.get_logging_config()[source]#
aiida.common.log.override_log_level(level=50)[source]#

Temporarily adjust the log-level of logger.

aiida.common.log.report(self: Logger, msg, *args, **kwargs)[source]#

Log a message at the REPORT level.

Provide a singleton progress reporter implementation.

The interface is inspired by tqdm <https://github.com/tqdm/tqdm>, and indeed a valid implementation is:

from tqdm import tqdm
set_progress_reporter(tqdm, bar_format='{l_bar}{bar}{r_bar}')
aiida.common.progress_reporter.PROGRESS_REPORTER#

alias of ProgressReporterNull

class aiida.common.progress_reporter.ProgressReporterAbstract(*, total: int, desc: str | None = None, **kwargs: Any)[source]#

Bases: object

An abstract class for incrementing a progress reporter.

This class provides the base interface for any ProgressReporter class.

Example Usage:

with ProgressReporter(total=10, desc="A process:") as progress:
    for i in range(10):
        progress.set_description_str(f"A process: {i}")
        progress.update()
__dict__ = mappingproxy({'__module__': 'aiida.common.progress_reporter', '__doc__': 'An abstract class for incrementing a progress reporter.\n\n    This class provides the base interface for any `ProgressReporter` class.\n\n    Example Usage::\n\n        with ProgressReporter(total=10, desc="A process:") as progress:\n            for i in range(10):\n                progress.set_description_str(f"A process: {i}")\n                progress.update()\n\n    ', '__init__': <function ProgressReporterAbstract.__init__>, 'total': <property object>, 'desc': <property object>, 'n': <property object>, '__enter__': <function ProgressReporterAbstract.__enter__>, '__exit__': <function ProgressReporterAbstract.__exit__>, 'set_description_str': <function ProgressReporterAbstract.set_description_str>, 'update': <function ProgressReporterAbstract.update>, 'reset': <function ProgressReporterAbstract.reset>, '__dict__': <attribute '__dict__' of 'ProgressReporterAbstract' objects>, '__weakref__': <attribute '__weakref__' of 'ProgressReporterAbstract' objects>, '__annotations__': {'_increment': 'int'}})#
__enter__() ProgressReporterAbstract[source]#

Enter the contextmanager.

__exit__(exctype: Type[BaseException] | None, excinst: BaseException | None, exctb: TracebackType | None)[source]#

Exit the contextmanager.

__init__(*, total: int, desc: str | None = None, **kwargs: Any)[source]#

Initialise the progress reporting contextmanager.

Parameters:
  • total – The number of expected iterations.

  • desc – A description of the process

__module__ = 'aiida.common.progress_reporter'#
__weakref__#

list of weak references to the object (if defined)

property desc: str | None#

Return the description of the process.

property n: int#

Return the current iteration.

reset(total: int | None = None)[source]#

Resets current iterations to 0.

Parameters:

total – If not None, update number of expected iterations.

set_description_str(text: str | None = None, refresh: bool = True)[source]#

Set the text shown by the progress reporter.

Parameters:
  • text – The text to show

  • refresh – Force refresh of the progress reporter

property total: int#

Return the total iterations expected.

update(n: int = 1)[source]#

Update the progress counter.

Parameters:

n – Increment to add to the internal counter of iterations

class aiida.common.progress_reporter.ProgressReporterNull(*, total: int, desc: str | None = None, **kwargs: Any)[source]#

Bases: ProgressReporterAbstract

A null implementation of the progress reporter.

This implementation does not output anything.

__annotations__ = {}#
__module__ = 'aiida.common.progress_reporter'#
aiida.common.progress_reporter.create_callback(progress_reporter: ProgressReporterAbstract) Callable[[str, Any], None][source]#

Create a callback function to update the progress reporter.

Returns:

a callback to report on the process, callback(action, value), with the following callback signatures:

  • callback('init', {'total': <int>, 'description': <str>}),

    to reset the progress with a new total iterations and description

  • callback('update', <int>),

    to update the progress by a certain number of iterations

aiida.common.progress_reporter.get_progress_reporter() Type[ProgressReporterAbstract][source]#

Return the progress reporter

Example Usage:

with get_progress_reporter()(total=10, desc="A process:") as progress:
    for i in range(10):
        progress.set_description_str(f"A process: {i}")
        progress.update()
aiida.common.progress_reporter.set_progress_bar_tqdm(bar_format: str | None = '{desc:40.40}{percentage:6.1f}%|{bar}| {n_fmt}/{total_fmt}', leave: bool | None = False, **kwargs: Any)[source]#

Set a tqdm implementation of the progress reporter interface.

See set_progress_reporter() for details.

Parameters:
  • bar_format – Specify a custom bar string format.

  • leave – If True, keeps all traces of the progressbar upon termination of iteration. If None, will leave only if position is 0.

  • kwargs – pass to the tqdm init

aiida.common.progress_reporter.set_progress_reporter(reporter: Type[ProgressReporterAbstract] | None = None, **kwargs: Any)[source]#

Set the progress reporter implementation

Parameters:
  • reporter – A progress reporter for a process. If None, reset to ProgressReporterNull.

  • kwargs – If present, set a partial function with these kwargs

The reporter should be a context manager that implements the ProgressReporterAbstract() interface.

Example Usage:

set_progress_reporter(ProgressReporterNull)
with get_progress_reporter()(total=10, desc="A process:") as progress:
    for i in range(10):
        progress.set_description_str(f"A process: {i}")
        progress.update()

Utilities related to pydantic.

aiida.common.pydantic.MetadataField(default: Any | None = None, *, priority: int = 0, short_name: str | None = None, option_cls: Any | None = None, **kwargs)[source]#

Return a pydantic.fields.Field instance with additional metadata.

class Model(BaseModel):

    attribute: MetadataField('default', priority=1000, short_name='-A')

This is a utility function that constructs a Field instance with an easy interface to add additional metadata. It is possible to add metadata using Annotated:

class Model(BaseModel):

    attribute: Annotated[str, {'metadata': 'value'}] = Field(...)

However, when requiring multiple metadata, this notation can make the model difficult to read. Since this utility is only used to automatically build command line interfaces from the model definition, it is possible to restrict which metadata are accepted.

Parameters:
  • priority – Used to order the list of all fields in the model. Ordering is done from small to large priority.

  • short_name – Optional short name to use for an option on a command line interface.

  • option_cls – The click.Option class to use to construct the option.

Utility functions to operate on datetime objects.

aiida.common.timezone.delta(from_time: datetime, to_time: datetime | None = None) timedelta[source]#

Return the datetime object representing the difference between two datetime objects.

Parameters:
  • from_time – The starting datetime object.

  • to_time – The end datetime object. If not specified aiida.common.timezone.now() is used.

Returns:

The delta datetime object.

aiida.common.timezone.localtime(value: datetime) datetime[source]#

Make a datetime.datetime object timezone aware with the local timezone.

Parameters:

value – The datetime object to make aware.

Returns:

A timezone aware datetime object with the timezone set to that of the operating system.

aiida.common.timezone.make_aware(value: datetime, tz: tzinfo | None = None) datetime[source]#

Make the given datetime object timezone aware.

Parameters:
  • value – The datetime object to make aware.

  • tz – The timezone to set. If not defined the system local timezone is assumed for the target timezone.

Returns:

A timezone aware datetime object.
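A minimal sketch of the make_aware behaviour using only the standard datetime module (a hypothetical stand-in; the aiida implementation may handle edge cases differently):

```python
from datetime import datetime, timedelta, timezone

def make_aware(value, tz=None):
    """Attach a timezone to a naive datetime, defaulting to the local one."""
    if value.tzinfo is not None:
        # already aware: convert to the target timezone instead
        return value.astimezone(tz)
    return value.replace(tzinfo=tz or datetime.now().astimezone().tzinfo)

naive = datetime(2024, 1, 1, 12, 0)
aware = make_aware(naive, timezone(timedelta(hours=2)))

assert aware.tzinfo is not None
assert aware.utcoffset() == timedelta(hours=2)
```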

aiida.common.timezone.now() datetime[source]#

Return the datetime object of the current time.

Returns:

datetime object representing the current time

aiida.common.timezone.timezone_from_name(name: str) tzinfo[source]#

Return a datetime.tzinfo instance corresponding to the given timezone name.

Parameters:

name – The timezone name. Should correspond to a known name in the Olson database. https://en.wikipedia.org/wiki/Tz_database

Returns:

The corresponding timezone object.

Raises:

ValueError – if the timezone name is unknown.

Miscellaneous generic utility functions and classes.

class aiida.common.utils.ArrayCounter[source]#

Bases: object

A counter & a method that increments it and returns its value. It is used in various tests.

__dict__ = mappingproxy({'__module__': 'aiida.common.utils', '__doc__': 'A counter & a method that increments it and returns its value.\n    It is used in various tests.\n    ', 'seq': None, '__init__': <function ArrayCounter.__init__>, 'array_counter': <function ArrayCounter.array_counter>, '__dict__': <attribute '__dict__' of 'ArrayCounter' objects>, '__weakref__': <attribute '__weakref__' of 'ArrayCounter' objects>, '__annotations__': {}})#
__init__()[source]#
__module__ = 'aiida.common.utils'#
__weakref__#

list of weak references to the object (if defined)

array_counter()[source]#
seq = None#
class aiida.common.utils.Capturing(capture_stderr=False)[source]#

Bases: object

This class captures stdout and returns it (as a list, split by lines).

Note: if you raise a SystemExit, you have to catch it outside. E.g., in our tests, this works:

import sys
with self.assertRaises(SystemExit):
    with Capturing() as output:
        sys.exit()

But outside of the testing environment, the code instead just exits.

To use it, access obj.stdout_lines, or simply iterate over the object.

Parameters:

capture_stderr – if True, also captures sys.stderr. To access the lines, use obj.stderr_lines. If False, obj.stderr_lines is None.

__dict__ = mappingproxy({'__module__': 'aiida.common.utils', '__doc__': 'This class captures stdout and returns it\n    (as a list, split by lines).\n\n    Note: if you raise a SystemExit, you have to catch it outside.\n    E.g., in our tests, this works::\n\n        import sys\n        with self.assertRaises(SystemExit):\n            with Capturing() as output:\n                sys.exit()\n\n    But out of the testing environment, the code instead just exits.\n\n    To use it, access the obj.stdout_lines, or just iterate over the object\n\n    :param capture_stderr: if True, also captures sys.stderr. To access the\n        lines, use obj.stderr_lines. If False, obj.stderr_lines is None.\n    ', '__init__': <function Capturing.__init__>, '__enter__': <function Capturing.__enter__>, '__exit__': <function Capturing.__exit__>, '__str__': <function Capturing.__str__>, '__iter__': <function Capturing.__iter__>, '__dict__': <attribute '__dict__' of 'Capturing' objects>, '__weakref__': <attribute '__weakref__' of 'Capturing' objects>, '__annotations__': {}})#
__enter__()[source]#

Enter the context where all output is captured.

__exit__(*args)[source]#

Exit the context where all output is captured.

__init__(capture_stderr=False)[source]#

Construct a new instance.

__iter__()[source]#
__module__ = 'aiida.common.utils'#
__str__()[source]#

Return str(self).

__weakref__#

list of weak references to the object (if defined)
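The core mechanism of such a class is redirecting sys.stdout to an in-memory buffer. A minimal sketch (a hypothetical stand-in, not the aiida.common implementation):

```python
import sys
from io import StringIO

class CapturingSketch:
    """Minimal stdout-capturing context manager."""

    def __enter__(self):
        self._old_stdout = sys.stdout
        self._buffer = StringIO()
        sys.stdout = self._buffer  # everything print()ed now lands in the buffer
        return self

    def __exit__(self, *args):
        sys.stdout = self._old_stdout  # always restore the real stdout
        self.stdout_lines = self._buffer.getvalue().splitlines()
        return False  # do not suppress exceptions such as SystemExit

    def __iter__(self):
        return iter(self.stdout_lines)

with CapturingSketch() as output:
    print('hello')
    print('world')
```

Because __exit__ returns False, a SystemExit raised inside the block still propagates, which is why the tests above need assertRaises around the context manager.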

class aiida.common.utils.DatetimePrecision(dtobj, precision)[source]#

Bases: object

A simple class which stores a datetime object with its precision. No internal check is done (because it is not possible).

precision:

1 (only full date)
2 (date + hour)
3 (date + hour + minute)
4 (date + hour + minute + second)

__dict__ = mappingproxy({'__module__': 'aiida.common.utils', '__doc__': 'A simple class which stores a datetime object with its precision. No\n    internal check is done (cause itis not possible).\n\n    precision:  1 (only full date)\n                2 (date plus hour)\n                3 (date + hour + minute)\n                4 (dare + hour + minute +second)\n    ', '__init__': <function DatetimePrecision.__init__>, '__dict__': <attribute '__dict__' of 'DatetimePrecision' objects>, '__weakref__': <attribute '__weakref__' of 'DatetimePrecision' objects>, '__annotations__': {}})#
__init__(dtobj, precision)[source]#

Constructor to check valid datetime object and precision

__module__ = 'aiida.common.utils'#
__weakref__#

list of weak references to the object (if defined)

class aiida.common.utils.ErrorAccumulator(*error_cls)[source]#

Bases: object

Allows running a number of functions and collecting all the errors they raise.

This makes it possible to validate multiple things and tell the user about all the errors encountered at once. Works best if the individual functions do not depend on each other.

Does not allow tracing the stack of each error, so do not use it for debugging, but for semantic checking with user-friendly error messages.

__dict__ = mappingproxy({'__module__': 'aiida.common.utils', '__doc__': 'Allows to run a number of functions and collect all the errors they raise\n\n    This allows to validate multiple things and tell the user about all the\n    errors encountered at once. Works best if the individual functions do not depend on each other.\n\n    Does not allow to trace the stack of each error, therefore do not use for debugging, but for\n    semantical checking with user friendly error messages.\n    ', '__init__': <function ErrorAccumulator.__init__>, 'run': <function ErrorAccumulator.run>, 'success': <function ErrorAccumulator.success>, 'result': <function ErrorAccumulator.result>, 'raise_errors': <function ErrorAccumulator.raise_errors>, '__dict__': <attribute '__dict__' of 'ErrorAccumulator' objects>, '__weakref__': <attribute '__weakref__' of 'ErrorAccumulator' objects>, '__annotations__': {}})#
__init__(*error_cls)[source]#
__module__ = 'aiida.common.utils'#
__weakref__#

list of weak references to the object (if defined)

raise_errors(raise_cls)[source]#
result(raise_error=<class 'Exception'>)[source]#
run(function, *args, **kwargs)[source]#
success()[source]#
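The accumulate-instead-of-raise pattern can be sketched as follows; this is a simplified hypothetical version, not the aiida.common implementation:

```python
class ErrorAccumulatorSketch:
    """Run validator functions and collect the exceptions they raise."""

    def __init__(self, *error_cls):
        self.error_cls = error_cls or (Exception,)
        self.errors = {cls: [] for cls in self.error_cls}

    def run(self, function, *args, **kwargs):
        """Call the function, swallowing and recording any declared error class."""
        try:
            function(*args, **kwargs)
        except self.error_cls as err:
            self.errors.setdefault(type(err), []).append(err)

    def success(self):
        """Return True if no errors were collected."""
        return not any(self.errors.values())

acc = ErrorAccumulatorSketch(ValueError, TypeError)
acc.run(int, 'not-a-number')  # raises ValueError internally
acc.run(len, 42)              # raises TypeError internally
acc.run(int, '7')             # succeeds
```

After all validators have run, the caller can inspect acc.errors and report every problem at once rather than stopping at the first.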
class aiida.common.utils.Prettifier(format)[source]#

Bases: object

Class to manage prettifiers (typically for labels of kpoints in band plots)

__dict__ = mappingproxy({'__module__': 'aiida.common.utils', '__doc__': 'Class to manage prettifiers (typically for labels of kpoints\n    in band plots)\n    ', '_prettify_label_pass': <classmethod(<function Prettifier._prettify_label_pass>)>, '_prettify_label_agr': <classmethod(<function Prettifier._prettify_label_agr>)>, '_prettify_label_agr_simple': <classmethod(<function Prettifier._prettify_label_agr_simple>)>, '_prettify_label_gnuplot': <classmethod(<function Prettifier._prettify_label_gnuplot>)>, '_prettify_label_gnuplot_simple': <classmethod(<function Prettifier._prettify_label_gnuplot_simple>)>, '_prettify_label_latex': <classmethod(<function Prettifier._prettify_label_latex>)>, '_prettify_label_latex_simple': <classmethod(<function Prettifier._prettify_label_latex_simple>)>, 'prettifiers': <aiida.common.lang.classproperty object>, 'get_prettifiers': <classmethod(<function Prettifier.get_prettifiers>)>, '__init__': <function Prettifier.__init__>, 'prettify': <function Prettifier.prettify>, '__dict__': <attribute '__dict__' of 'Prettifier' objects>, '__weakref__': <attribute '__weakref__' of 'Prettifier' objects>, '__annotations__': {}})#
__init__(format)[source]#

Create a class to prettify strings of a given format.

Parameters:

format – a string with the format to use to prettify. Valid formats are obtained from self.prettifiers

__module__ = 'aiida.common.utils'#
__weakref__#

list of weak references to the object (if defined)

classmethod _prettify_label_agr(label)[source]#

Prettifier for XMGrace

Parameters:

label – a string to prettify

classmethod _prettify_label_agr_simple(label)[source]#

Prettifier for XMGrace (for old label names)

Parameters:

label – a string to prettify

classmethod _prettify_label_gnuplot(label)[source]#

Prettifier for Gnuplot

Note:

uses unicode, returns unicode strings (potentially, if needed)

Parameters:

label – a string to prettify

classmethod _prettify_label_gnuplot_simple(label)[source]#

Prettifier for Gnuplot (for old label names)

Note:

uses unicode, returns unicode strings (potentially, if needed)

Parameters:

label – a string to prettify

classmethod _prettify_label_latex(label)[source]#

Prettifier for matplotlib, using LaTeX syntax

Parameters:

label – a string to prettify

classmethod _prettify_label_latex_simple(label)[source]#

Prettifier for matplotlib, using LaTeX syntax (for old label names)

Parameters:

label – a string to prettify

classmethod _prettify_label_pass(label)[source]#

No-op prettifier, simply returns the same label

Parameters:

label – a string to prettify

classmethod get_prettifiers()[source]#

Return a list of valid prettifier strings

Returns:

a list of strings

prettifiers = {'agr_seekpath': <bound method Prettifier._prettify_label_agr of <class 'aiida.common.utils.Prettifier'>>, 'agr_simple': <bound method Prettifier._prettify_label_agr_simple of <class 'aiida.common.utils.Prettifier'>>, 'gnuplot_seekpath': <bound method Prettifier._prettify_label_gnuplot of <class 'aiida.common.utils.Prettifier'>>, 'gnuplot_simple': <bound method Prettifier._prettify_label_gnuplot_simple of <class 'aiida.common.utils.Prettifier'>>, 'latex_seekpath': <bound method Prettifier._prettify_label_latex of <class 'aiida.common.utils.Prettifier'>>, 'latex_simple': <bound method Prettifier._prettify_label_latex_simple of <class 'aiida.common.utils.Prettifier'>>, 'pass': <bound method Prettifier._prettify_label_pass of <class 'aiida.common.utils.Prettifier'>>}#
prettify(label)[source]#

Prettify a label using the format passed in the initializer

Parameters:

label – the string to prettify

Returns:

a prettified string

aiida.common.utils.are_dir_trees_equal(dir1, dir2)[source]#

Compare two directories recursively. Files in each directory are assumed to be equal if their names and contents are equal.

Parameters:
  • dir1 – the first directory path

  • dir2 – the second directory path

Returns:

True if the directory trees are the same and there were no errors while accessing the directories or files, False otherwise.
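A recursive comparison of this kind can be sketched with the standard-library filecmp module (a hypothetical stand-in, not the aiida.common implementation):

```python
import filecmp
import os
import tempfile

def dir_trees_equal_sketch(dir1, dir2):
    """Recursively compare two directories by file names and contents."""
    comparison = filecmp.dircmp(dir1, dir2)
    if comparison.left_only or comparison.right_only or comparison.funny_files:
        return False
    # dircmp compares os.stat signatures by default; force a content comparison.
    _, mismatch, errors = filecmp.cmpfiles(dir1, dir2, comparison.common_files, shallow=False)
    if mismatch or errors:
        return False
    return all(
        dir_trees_equal_sketch(os.path.join(dir1, sub), os.path.join(dir2, sub))
        for sub in comparison.common_dirs
    )

with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    for root in (a, b):
        with open(os.path.join(root, 'file.txt'), 'w', encoding='utf8') as handle:
            handle.write('same content')
    equal = dir_trees_equal_sketch(a, b)
    with open(os.path.join(b, 'extra.txt'), 'w', encoding='utf8') as handle:
        handle.write('only in b')
    unequal = dir_trees_equal_sketch(a, b)
```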

aiida.common.utils.get_class_string(obj)[source]#

Return the string identifying the class of the object (module + object name, joined by dots).

It works both for classes and for class instances.

aiida.common.utils.get_new_uuid() str[source]#

Return a new UUID (typically to be used for new nodes).

aiida.common.utils.get_object_from_string(class_string)[source]#

Given a string identifying an object (as returned by the get_class_string method) load and return the actual object.
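The round trip between these two functions can be sketched with importlib; the _sketch names are hypothetical, and for simplicity this version uses __name__, so it does not handle nested classes:

```python
import importlib

def get_class_string_sketch(obj):
    """Return 'module.ClassName' for a class or a class instance."""
    cls = obj if isinstance(obj, type) else type(obj)
    return f'{cls.__module__}.{cls.__name__}'

def get_object_from_string_sketch(class_string):
    """Inverse operation: import the module and fetch the named attribute."""
    module_name, _, attr = class_string.rpartition('.')
    return getattr(importlib.import_module(module_name), attr)

string = get_class_string_sketch({})
restored = get_object_from_string_sketch(string)
```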

aiida.common.utils.get_unique_filename(filename, list_of_filenames)[source]#

Return a unique filename that can be added to the list_of_filenames.

If filename is not in list_of_filenames, it simply returns the filename string itself. Otherwise, it appends an integer number to the filename (before the extension) until it finds a unique filename.

Parameters:
  • filename – the filename to add

  • list_of_filenames – the list of filenames to which filename should be added, without name duplicates

Returns:

Either filename or its modification, with a number appended between the name and the extension.
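The counting scheme can be sketched as follows; the function name and the exact separator between name and counter are assumptions of this sketch, not guaranteed to match the aiida.common implementation:

```python
import os

def get_unique_filename_sketch(filename, list_of_filenames):
    """Return filename, or filename with a counter inserted before the extension."""
    if filename not in list_of_filenames:
        return filename
    stem, ext = os.path.splitext(filename)
    counter = 1
    # Increment the counter until the candidate no longer collides.
    while f'{stem}-{counter}{ext}' in list_of_filenames:
        counter += 1
    return f'{stem}-{counter}{ext}'
```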

aiida.common.utils.grouper(n, iterable)[source]#

Given an iterable, return an iterable that yields tuples of n elements taken from iterable, except the last one, which has whatever length is required to exhaust iterable (i.e., no filling is applied).

Parameters:
  • n – length of each tuple (except the last one, which will have length <= n)

  • iterable – the iterable to divide in groups
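A grouper without fill values can be sketched with itertools.islice (a hypothetical equivalent, not necessarily the aiida.common implementation):

```python
from itertools import islice

def grouper_sketch(n, iterable):
    """Yield tuples of up to n elements; the last tuple is not padded."""
    iterator = iter(iterable)
    while True:
        chunk = tuple(islice(iterator, n))
        if not chunk:
            return
        yield chunk

groups = list(grouper_sketch(3, range(8)))
# [(0, 1, 2), (3, 4, 5), (6, 7)]
```

This differs from the classic itertools "grouper" recipe, which pads the last group with a fill value.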

aiida.common.utils.join_labels(labels, join_symbol='|', threshold=1e-06)[source]#

Join labels with a joining symbol when they are very close

Parameters:
  • labels – a list of length-2 tuples, in the format (position, label)

  • join_symbol – the string to use to join different paths. By default, a pipe

  • threshold – the threshold to decide if two float values are the same and should be joined

Returns:

the same list as labels, but with the second value possibly replaced with strings joined when close enough
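The merging logic can be sketched as a single pass that compares each position to the last kept one (a hypothetical sketch, not the aiida.common implementation):

```python
def join_labels_sketch(labels, join_symbol='|', threshold=1e-06):
    """Merge labels whose positions differ by less than threshold."""
    result = []
    for position, label in labels:
        if result and abs(position - result[-1][0]) < threshold:
            # Positions coincide: join the label texts instead of duplicating the point.
            result[-1] = (result[-1][0], result[-1][1] + join_symbol + label)
        else:
            result.append((position, label))
    return result

joined = join_labels_sketch([(0.0, 'G'), (0.5, 'X'), (0.5, 'Y'), (1.0, 'G')])
# [(0.0, 'G'), (0.5, 'X|Y'), (1.0, 'G')]
```

This is the typical situation at a discontinuity in a band-structure path, where two k-point labels share one plot position.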

aiida.common.utils.prettify_labels(labels, format=None)[source]#

Prettify label for typesetting in various formats

Parameters:
  • labels – a list of length-2 tuples, in the format (position, label)

  • format – a string with the format for the prettifier (e.g. ‘agr’, ‘matplotlib’, …)

Returns:

the same list as labels, but with the second value possibly replaced with a prettified version that typesets nicely in the selected format

aiida.common.utils.str_timedelta(dt, max_num_fields=3, short=False, negative_to_zero=False)[source]#

Given a timedelta dt, return it in an HH:MM:SS format.

Parameters:
  • dt – a TimeDelta object

  • max_num_fields – maximum number of non-zero fields to show (for instance if the number of days is non-zero, shows only days, hours and minutes, but not seconds)

  • short – if False, always print max_num_fields fields, even if they are zero. If True, do not print the leading fields if they are zero.

  • negative_to_zero – if True, set dt = 0 if dt < 0.
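The basic decomposition of a timedelta into fields can be sketched with divmod; this simplified hypothetical version always prints days and ignores the max_num_fields and short options:

```python
from datetime import timedelta

def str_timedelta_sketch(dt, negative_to_zero=False):
    """Format a timedelta as 'Dd HH:MM:SS' (simplified sketch)."""
    seconds = int(dt.total_seconds())
    if negative_to_zero and seconds < 0:
        seconds = 0
    days, remainder = divmod(seconds, 86400)
    hours, remainder = divmod(remainder, 3600)
    minutes, secs = divmod(remainder, 60)
    return f'{days}d {hours:02d}:{minutes:02d}:{secs:02d}'

formatted = str_timedelta_sketch(timedelta(hours=26, minutes=5, seconds=3))
# '1d 02:05:03'
```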

aiida.common.utils.strip_prefix(full_string, prefix)[source]#

Strip the prefix from the given string and return it. If the prefix is not present, the original string is returned unaltered.

Parameters:
  • full_string – the string from which to remove the prefix

  • prefix – the prefix to remove

Returns:

the string with prefix removed

aiida.common.utils.validate_list_of_string_tuples(val, tuple_length)[source]#

Check that:

  1. val is a list or tuple

  2. each element of the list:

     1. is a list or tuple

     2. has length equal to the parameter tuple_length

     3. contains only strings

Return if valid, raise ValidationError if invalid.

aiida.common.utils.validate_uuid(given_uuid: str) bool[source]#

A simple check for the UUID validity.
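Such a check can be sketched with the standard-library uuid module; the round-trip comparison rejects non-canonical spellings (e.g. missing hyphens or uppercase hex). This is a hypothetical sketch, not necessarily the aiida.common implementation:

```python
from uuid import UUID

def validate_uuid_sketch(given_uuid: str) -> bool:
    """Return True if the string is a canonically formatted UUID."""
    try:
        parsed = UUID(given_uuid)
    except (ValueError, AttributeError, TypeError):
        return False
    # Require canonical form, so e.g. strings missing hyphens are rejected.
    return str(parsed) == given_uuid
```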

Define warnings that can be thrown by AiiDA.

exception aiida.common.warnings.AiidaDeprecationWarning[source]#

Bases: Warning

Class for AiiDA deprecations.

It does not inherit, on purpose, from DeprecationWarning as this would be filtered out by default. Enabled by default, you can disable it by running in the shell:

verdi config warnings.showdeprecations False
__module__ = 'aiida.common.warnings'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.warnings.AiidaEntryPointWarning[source]#

Bases: Warning

Class for warnings concerning AiiDA entry points.

__module__ = 'aiida.common.warnings'#
__weakref__#

list of weak references to the object (if defined)

exception aiida.common.warnings.AiidaTestWarning[source]#

Bases: Warning

Class for warnings concerning the AiiDA testing infrastructure.

__module__ = 'aiida.common.warnings'#
__weakref__#

list of weak references to the object (if defined)

aiida.common.warnings.warn_deprecation(message: str, version: int, stacklevel=2) None[source]#

Warn about a deprecation for a future aiida-core version.

Warnings are emitted if the warnings.showdeprecations config option is set to True. Its value can be overwritten with the AIIDA_WARN_v{version} environment variable. The exact value for the environment variable is irrelevant. Any value will mean the variable is enabled and warnings will be emitted.

Parameters:
  • message – the message to be printed

  • version – the major version number of the future version

  • stacklevel – the stack level at which the warning is issued
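The emission mechanism can be sketched with the standard warnings module. Both names below are hypothetical stand-ins, and this sketch omits the config-option and environment-variable gating described above:

```python
import warnings

class AiidaDeprecationWarningSketch(Warning):
    """Stand-in warning class; inherits Warning, not DeprecationWarning,
    so it is not filtered out by default."""

def warn_deprecation_sketch(message: str, version: int, stacklevel: int = 2) -> None:
    """Emit a deprecation warning tagged with the future major version."""
    warnings.warn(
        f'{message} (this will be removed in aiida-core v{version})',
        AiidaDeprecationWarningSketch,
        stacklevel=stacklevel,
    )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    warn_deprecation_sketch('`old_method` is deprecated, use `new_method`', version=3)
```

The stacklevel parameter makes the warning point at the caller's call site rather than at the helper itself.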