aiida.work package

Submodules

class aiida.work.class_loader.ClassLoader(parent=None)[source]

Bases: plum.class_loader.ClassLoader

__module__ = 'aiida.work.class_loader'
find_class(name)[source]

Load a class from a string

static is_wrapped_job_calculation()[source]
aiida.work.class_loader.get_default()[source]
aiida.work.daemon.tick_workflow_engine(storage=None, print_exceptions=True)[source]
aiida.work.db_types.get_db_type(native_type)[source]
aiida.work.db_types.to_db_type(value)[source]
aiida.work.db_types.to_native_type(data)[source]
class aiida.work.execution_engine.ExecutionEngine(max_workers=None)[source]

Bases: plum.engine.parallel.MultithreadedEngine

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.execution_engine'
class aiida.work.interstep.Action(running_info, fn)

Bases: tuple

__getnewargs__()

Return self as a plain tuple. Used by copy and pickle.

__getstate__()

Exclude the OrderedDict from pickling

__module__ = 'aiida.work.interstep'
static __new__(running_info, fn)

Create new instance of Action(running_info, fn)

__repr__()

Return a nicely formatted representation string

__slots__ = ()
_asdict()

Return a new OrderedDict which maps field names to their values

_fields = ('running_info', 'fn')
classmethod _make(iterable)

Make a new Action object from a sequence or iterable

_replace(**kwds)

Return a new Action object replacing specified fields with new values

fn

Alias for field number 1

running_info

Alias for field number 0
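Since Action is a plain namedtuple, the helpers listed above behave exactly like those of any collections.namedtuple. A minimal sketch (the field values here are hypothetical):

```python
from collections import namedtuple

# The documented Action is a plain two-field namedtuple
Action = namedtuple('Action', ['running_info', 'fn'])

act = Action(running_info='pid-42', fn=len)   # hypothetical values
replaced = act._replace(fn=str)               # new Action with fn swapped
as_dict = act._asdict()                       # maps field names to values
made = Action._make(['pid-43', repr])         # build from any iterable
```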

class aiida.work.interstep.Append(key, action)[source]

Bases: aiida.work.interstep.UpdateContext

This interstep will append the value returned by the registered action to a specific key in the context

class Builder(value)[source]

Bases: aiida.work.interstep.UpdateContextBuilder

__module__ = 'aiida.work.interstep'
build(key)[source]
__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.interstep'
on_next_step_starting(workchain)[source]

Appends the result stored in the action to the given key of the workchain context

Parameters:workchain (aiida.work.WorkChain) – WorkChain whose context should be updated
class aiida.work.interstep.Assign(key, action)[source]

Bases: aiida.work.interstep.UpdateContext

This interstep will assign the value returned by the registered action to a specific key in the context

class Builder(value)[source]

Bases: aiida.work.interstep.UpdateContextBuilder

__module__ = 'aiida.work.interstep'
build(key)[source]
__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.interstep'
on_next_step_starting(workchain)[source]

Assigns the result stored in the action to the given key of the workchain context

Parameters:workchain (aiida.work.WorkChain) – WorkChain whose context should be updated
aiida.work.interstep.Calc(running_info)[source]

Creates an Action tuple based on a RunningInfo tuple for a legacy calculation node

Parameters:running_info – RunningInfo tuple
Returns:Action tuple
class aiida.work.interstep.Interstep[source]

Bases: aiida.work.interstep.Savable

An interstep is an action that is performed between steps of a workchain. These allow the user to perform actions when a step has finished and when the next step (if there is one) is about to start.

CLASS_NAME = 'class_name'
__abstractmethods__ = frozenset([])
__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.work.interstep'
on_last_step_finished(workchain)[source]

Called when the last step has finished

Parameters:workchain (aiida.work.WorkChain) – The workchain this interstep belongs to
on_next_step_starting(workchain)[source]

Called when the next step is about to start

Parameters:workchain (aiida.work.WorkChain) – The workchain this interstep belongs to
save_instance_state(out_state)[source]

Store the information of the instance in a bundle that is required at a minimum to allow it to be reconstructed

Parameters:out_state – a bundle in which to store the information
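The two hooks above bracket a step boundary. A minimal pure-Python sketch of that lifecycle; LoggingInterstep and cross_step_boundary are hypothetical names for illustration, not part of aiida.work:

```python
class LoggingInterstep:
    """Sketch of the documented two-hook lifecycle (omits the Savable base)."""
    def __init__(self, log):
        self._log = log

    def on_last_step_finished(self, workchain):
        # e.g. an UpdateContext interstep creates its wait on here
        self._log.append('last_step_finished')

    def on_next_step_starting(self, workchain):
        # e.g. an UpdateContext interstep updates the workchain context here
        self._log.append('next_step_starting')


def cross_step_boundary(intersteps, workchain):
    """How a workchain might drive its intersteps between two steps."""
    for interstep in intersteps:
        interstep.on_last_step_finished(workchain)
    for interstep in intersteps:
        interstep.on_next_step_starting(workchain)


log = []
cross_step_boundary([LoggingInterstep(log)], workchain=None)
```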
aiida.work.interstep.Legacy(running_info)[source]

Creates an Action tuple based on a RunningInfo tuple for a legacy calculation or workflow node

Parameters:running_info – RunningInfo tuple
Returns:Action tuple
aiida.work.interstep.Outputs(running_info)[source]

Convenience proxy function to allow returning the outputs generated by a process

class aiida.work.interstep.Savable[source]

Bases: object

__module__ = 'aiida.work.interstep'
__weakref__

list of weak references to the object (if defined)

classmethod create_from(saved_state)[source]

Create the wait on from a saved instance state.

Parameters:saved_state (plum.persistence.Bundle) – The saved instance state
Returns:The wait on with its state as it was when it was saved
load_instance_state(saved_state)[source]
save_instance_state(out_state)[source]
class aiida.work.interstep.UpdateContext(key, action)[source]

Bases: aiida.work.interstep.Interstep

Intersteps that evaluate an action and store the results in the context of the Process

__abstractmethods__ = frozenset([])
__eq__(other)[source]

x.__eq__(y) <==> x==y

__init__(key, action)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.interstep'
_create_wait_on()[source]

Creates the wait on, based on the running info stored in the action, that will instruct the workchain what to wait for

load_instance_state(saved_state)[source]
on_last_step_finished(workchain)[source]

Insert the barrier into the workchain by creating the Interstep’s wait on

save_instance_state(out_state)[source]

Store the information of the instance in a bundle that is required at a minimum to allow it to be reconstructed

Parameters:out_state – a bundle in which to store the information
class aiida.work.interstep.UpdateContextBuilder(value)[source]

Bases: object

A builder of an UpdateContext instance. The key components of an UpdateContext Interstep are the key and the action, which are not both available at the time of construction within the workchain. This builder class serves as an intermediate step, registering the value of the Interstep. Calling the build method will then construct a fully defined UpdateContext Interstep instance

__init__(value)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.interstep'
__weakref__

list of weak references to the object (if defined)

build(key)[source]
aiida.work.interstep.Wf(running_info)[source]

Creates an Action tuple based on a RunningInfo tuple for a legacy Workflow node

Parameters:running_info – RunningInfo tuple
Returns:Action tuple
aiida.work.interstep._get_proc_outputs_from_registry(pid)[source]

Return a dictionary of outputs for a calculation identified by pid

aiida.work.interstep._get_wf_outputs(pk)[source]

Return the results dictionary of a legacy workflow

aiida.work.interstep.action_from_running_info(running_info)[source]

Creates an Action tuple based on a RunningInfo tuple

Parameters:running_info – RunningInfo tuple
Returns:Action tuple
aiida.work.interstep.append_

alias of aiida.work.interstep.Builder

aiida.work.interstep.assign_

alias of aiida.work.interstep.Builder

aiida.work.interstep.load_with_classloader(bundle)[source]

Load a process from a saved instance state

Parameters:bundle – The saved instance state bundle
Returns:The process instance
Return type:aiida.work.process.Process
class aiida.work.persistence.Persistence(running_directory='/tmp/running', finished_directory='/tmp/running/finished', failed_directory='/tmp/running/failed')[source]

Bases: plum.persistence.pickle_persistence.PicklePersistence

Class that uses pickles stored in particular directories to persist the instance state of Processes.

__abstractmethods__ = frozenset([])
__init__(running_directory='/tmp/running', finished_directory='/tmp/running/finished', failed_directory='/tmp/running/failed')[source]

Create the pickle persistence object. If auto_persist is True then this object will automatically persist any Processes that are created and will keep their persisted state up to date as they run. By default this is turned off as the user may prefer to manually specify which process should be persisted.

The directory structure that will be used is:

  • running_directory/[pid].pickle – Currently active processes
  • finished_directory/[pid].pickle – Finished processes
  • failed_directory/[pid].pickle – Failed processes

Parameters:
  • auto_persist (bool) – Will automatically persist Processes if True.
  • running_directory (str) – The base directory to store all pickles in.
  • finished_directory (str) – The (relative) subdirectory to put finished Process pickles in. If None they will be deleted when finished.
  • failed_directory (str) – The (relative) subdirectory to put failed Process pickles in. If None they will be deleted on fail.
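With the default constructor arguments, the pickle for a given pid lands in one of three locations depending on the process state. The helper below is purely illustrative (pickle_path is a hypothetical name, not part of the API):

```python
import os

def pickle_path(pid, state='running',
                running_directory='/tmp/running',
                finished_directory='/tmp/running/finished',
                failed_directory='/tmp/running/failed'):
    """Hypothetical helper showing where a process pickle lives for each state."""
    base = {'running': running_directory,
            'finished': finished_directory,
            'failed': failed_directory}[state]
    return os.path.join(base, '{}.pickle'.format(pid))
```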
__module__ = 'aiida.work.persistence'
_clear(fileobj)[source]

Clear the contents of an open file.

Parameters:fileobj – The (open) file object
static _ensure_directory()[source]
_load_checkpoint(pid)[source]

Load a checkpoint from a pickle. Note that this will not properly check for locks and should not be called outside of this class

_release_process(pid, save_dir=None)[source]

Move a running process pickle to the given save directory; this is typically used when the process has finished or failed.

Parameters:
  • pid – The process ID
  • save_dir (str or None) – The directory to move the pickle to; can be None, indicating that the pickle should be deleted.
_save_noraise(process)[source]
clear_all_persisted()[source]
create_bundle(process)[source]
classmethod create_from_basedir(basedir, **kwargs)[source]
Create using a base directory, the pickles will be stored in:
  • running: [basedir]/running
  • finished: [basedir]/finished
  • failed: [basedir]/failed
Parameters:
  • basedir (str) – The base directory to store pickles under
  • kwargs – Any additional arguments to pass to the constructor
Returns:

A new instance.

create_from_file_and_persist(filepath)[source]

Try to load a process from a file and recreate the Process instance. To prevent multiple threads from recreating a Process from the same pickle, a reentrant lock is created before the state is loaded from the file, which will raise an exception if the file is already locked. Within the lock context manager we then attempt to recreate the Process from the process state and, when successful, we acquire the lock.

Parameters:filepath – path to the pickle to be loaded as a Process
Returns:Process instance
failed_directory
finished_directory
get_checkpoint_state(pid)[source]
get_running_path(pid)[source]

Get the path where the pickle for a process with pid will be stored while it’s running.

Parameters:pid – The process pid
Returns:A string to the absolute path of where the pickle is stored.
Return type:str
load_all_processes()[source]

Will detect all pickles in the running directory and will try to load them up into Processes. As soon as a pickle is considered for loading, a lock is placed on it, which is not released until the process is destroyed. This is necessary to prevent another thread from loading up the same process.

Returns:a list of Process instances
load_checkpoint_from_file_object(file_object)[source]
on_monitored_process_created(process)[source]
on_monitored_process_failed(pid)[source]
on_process_destroy(process)[source]
on_process_finish(process)[source]
on_process_run(process)[source]
on_process_wait(process, wait_on)[source]
persist_process(process)[source]
static pickle_filename()[source]
save(process)[source]
store_directory
unpersist_process(process)[source]
class aiida.work.persistence.RLock(filename, mode='a', timeout=5, check_interval=0.25, fail_when_locked=False, flags=6)[source]

Bases: portalocker.utils.Lock

A reentrant lock.

Functions in a similar way to threading.RLock in that it can be acquired multiple times. When the corresponding number of release() calls have been made, the lock finally releases the underlying file lock.

__init__(filename, mode='a', timeout=5, check_interval=0.25, fail_when_locked=False, flags=6)[source]

Constructor

__module__ = 'aiida.work.persistence'
acquire(timeout=None, check_interval=None, fail_when_locked=None)[source]

Acquire the locked filehandle

release()[source]

Releases the currently locked file handle
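The reentrancy described above can be sketched with a simple counter: only the first acquire takes the underlying lock and only the matching final release frees it. CountingLock is a hypothetical in-memory stand-in for the real portalocker-backed lock:

```python
class CountingLock:
    """In-memory sketch of reentrant acquire/release counting."""
    def __init__(self):
        self._count = 0
        self.locked = False

    def acquire(self):
        if self._count == 0:
            self.locked = True   # first acquire takes the underlying lock
        self._count += 1

    def release(self):
        if self._count == 0:
            raise RuntimeError('release() called more times than acquire()')
        self._count -= 1
        if self._count == 0:
            self.locked = False  # last release frees the underlying lock
```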

aiida.work.persistence._create_storage()[source]
aiida.work.persistence.get_default()[source]
class aiida.work.process.DictSchema(schema)[source]

Bases: object

__call__(value)[source]

Call this to validate the value against the schema.

Parameters:value – a regular dictionary or a ParameterData instance
Returns:tuple (success, msg). success is True if the value is valid and False otherwise, in which case msg will contain information about the validation failure.
Return type:tuple
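The (success, msg) contract of __call__ can be sketched with a plain function; validate and its key-to-type schema are hypothetical simplifications of the real schema handling:

```python
def validate(value, schema):
    """Return (success, msg) for a dict, with a schema mapping keys to types."""
    for key, expected_type in schema.items():
        if key not in value:
            return False, 'missing key: {}'.format(key)
        if not isinstance(value[key], expected_type):
            return False, 'wrong type for key: {}'.format(key)
    return True, None
```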
__init__(schema)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.process'
__weakref__

list of weak references to the object (if defined)

_get_template(dict)[source]
get_template()[source]
class aiida.work.process.FunctionProcess[source]

Bases: aiida.work.process.Process

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.process'
static _func(**kwargs)[source]

This is used internally to store the actual function that is being wrapped and will be replaced by the build method.

_func_args = None
_run(**kwargs)[source]
_setup_db_record()[source]
classmethod args_to_dict(*args)[source]

Create an input dictionary (i.e. label: value) from supplied args.

Parameters:args – The values to use
Returns:A label: value dictionary
static build(func, **kwargs)[source]

Build a Process from the given function. All function arguments will be assigned as process inputs. If keyword arguments are specified then these will also become inputs.

Parameters:
  • func – The function to build a process from
  • kwargs – Optional keyword arguments that will become additional inputs to the process
Returns:

A Process class that represents the function

Return type:

Process
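The input-harvesting part of build() can be sketched with inspect: every function argument becomes a process input, keeping any default. inputs_spec_from_function and add are hypothetical illustrations, not the real implementation:

```python
import inspect

def inputs_spec_from_function(func):
    """Map each function argument to an input entry, preserving defaults."""
    spec = {}
    for name, param in inspect.signature(func).parameters.items():
        spec[name] = None if param.default is inspect.Parameter.empty else param.default
    return spec

def add(a, b=2):   # example function to wrap
    return a + b
```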

class aiida.work.process.Process[source]

Bases: plum.process.Process

This class represents an AiiDA process which can be executed and will have full provenance saved in the database.

SINGLE_RETURN_LINKNAME = '_return'
class SaveKeys[source]

Bases: enum.Enum

Keys used to identify things in the saved instance state bundle.

CALC_ID = 'calc_id'
PARENT_CALC_PID = 'parent_calc_pid'
__module__ = 'aiida.work.process'
__abstractmethods__ = frozenset(['_run'])
__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.work.process'
_add_description_and_label()[source]
_create_and_setup_db_record()[source]
classmethod _create_default_exec_engine()[source]

Create the default execution engine. Used if the run() method is called instead of asking an execution engine to run this process.

Returns:An instance of ExecutionEngine.
_on_output_emitted(output_port, value, dynamic)[source]

The process has emitted a value on the given output port.

Parameters:
  • output_port – The output port name the value was emitted on
  • value – The value emitted
  • dynamic – Whether the output port was a dynamic one (i.e. not known beforehand)
_setup_db_record()[source]
_spec_type

alias of ProcessSpec

_use_cache_enabled()[source]
calc
classmethod create_db_record()[source]

Create a database calculation node that represents what happened in this process.

classmethod define(spec)[source]
do_run()[source]
classmethod get_inputs_template()[source]
get_parent_calc()[source]
get_provenance_inputs_iterator()[source]
on_create(pid, inputs, saved_instance_state)[source]

Called when the process is created. If a checkpoint is supplied the process should reinstate its state at the time the checkpoint was taken and if the checkpoint has a wait_on the process will continue from the corresponding callback function.

Parameters:inputs – The inputs the process should run using.
on_destroy()[source]

Called when a Process enters the DESTROYED state which should be the final process state and so we seal the calculation node

on_finish()[source]

Called when a Process enters the FINISHED state at which point we set the corresponding attribute of the workcalculation node

on_start()[source]

Called when the process is about to start for the first time.

Any class overriding this method should make sure to call the super method, usually at the end of the function.

out(output_port, value=None)[source]
report(msg, *args, **kwargs)[source]

Log a message to the logger, which should get saved to the database through the attached DbLogHandler. The class name and function name of the caller are prepended to the given message

run_after_queueing(wait_on)[source]
save_instance_state(bundle)[source]
class aiida.work.process.ProcessSpec[source]

Bases: plum.process_spec.ProcessSpec

__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.process'
fastforwardable()[source]
get_inputs_template()[source]

Get an object that represents a template of the known inputs and their defaults for the Process.

Returns:An object with attributes that represent the known inputs for this process. Default values will be filled in.
is_fastforwardable()[source]
class aiida.work.process._ProcessFinaliser[source]

Bases: plum.process_monitor.ProcessMonitorListener

Take care of finalising a process when it finishes either through successful completion or because of a failure caused by an exception.

__abstractmethods__ = frozenset([])
__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.process'
on_monitored_process_destroying(process)[source]
on_monitored_process_failed(pid)[source]
class aiida.work.process_registry.ProcessRegistry[source]

Bases: plum.knowledge_provider.KnowledgeProvider

This class is a knowledge provider that uses the AiiDA database to answer questions related to processes.

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.process_registry'
current_calc_node
current_pid
get_inputs(pid)[source]

Get all the inputs for a given process.

Parameters:pid – The process id.
Returns:A dictionary of the corresponding port names and input values.
Return type:dict
Raises:NotKnown
get_outputs(pid)[source]

Get all the outputs from a process.

Parameters:pid – The process id
Returns:A dictionary containing label: value entries.
Raises:NotKnown
has_finished(pid)[source]

Has the process finished?

Parameters:pid – The process id.
Returns:True if finished, False otherwise.
Return type:bool
Raises:NotKnown
class aiida.work.run.RunningInfo(type, pid)

Bases: tuple

__getnewargs__()

Return self as a plain tuple. Used by copy and pickle.

__getstate__()

Exclude the OrderedDict from pickling

__module__ = 'aiida.work.run'
static __new__(type, pid)

Create new instance of RunningInfo(type, pid)

__repr__()

Return a nicely formatted representation string

__slots__ = ()
_asdict()

Return a new OrderedDict which maps field names to their values

_fields = ('type', 'pid')
classmethod _make(iterable)

Make a new RunningInfo object from a sequence or iterable

_replace(**kwds)

Return a new RunningInfo object replacing specified fields with new values

pid

Alias for field number 1

type

Alias for field number 0

class aiida.work.run.RunningType[source]

Bases: enum.Enum

A type to indicate what type of object is running: a process, a calculation or a workflow

LEGACY_CALC = 1
LEGACY_WORKFLOW = 2
PROCESS = 0
__module__ = 'aiida.work.run'
aiida.work.run.legacy_calc(pk)[source]

Create a RunningInfo object for a legacy calculation

Parameters:pk (int) – The calculation pk
Returns:The running info
Return type:RunningInfo
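RunningType and RunningInfo combine as shown below. The enum members and namedtuple fields mirror the documentation; the body of legacy_calc is an assumption based on its documented signature and return type:

```python
from collections import namedtuple
from enum import Enum

class RunningType(Enum):   # mirrors the documented members
    PROCESS = 0
    LEGACY_CALC = 1
    LEGACY_WORKFLOW = 2

RunningInfo = namedtuple('RunningInfo', ['type', 'pid'])

def legacy_calc(pk):
    """Sketch of the documented factory for legacy calculations."""
    return RunningInfo(type=RunningType.LEGACY_CALC, pid=pk)
```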
aiida.work.run.legacy_workflow(pk)[source]

Create a RunningInfo object for a legacy workflow.

This can be used in conjunction with aiida.work.workchain.ToContext as follows:

>>> from aiida.work.workchain import WorkChain, ToContext, Outputs
>>>
>>> class MyWf(WorkChain):
>>>     @classmethod
>>>     def define(cls, spec):
>>>         super(MyWf, cls).define(spec)
>>>         spec.outline(cls.step1, cls.step2)
>>>
>>>     def step1(self):
>>>         wf = OldEquationOfState()
>>>         wf.start()
>>>         return ToContext(eos=legacy_workflow(wf.pk))
>>>
>>>     def step2(self):
>>>         # Now self.ctx.eos contains the terminated workflow
>>>         pass
Parameters:pk (int) – The workflow pk
Returns:The running info
Return type:RunningInfo
aiida.work.run.queue_up(process_class, inputs, storage)[source]

This queues up the Process so that it’s executed by the daemon when it gets around to it.

Parameters:
  • process_class – The process class to queue up.
  • inputs (collections.Mapping) – The inputs to the process.
  • storage – The storage engine which will be used to save the process (of type plum.persistence)
Returns:

The pid of the queued process.

aiida.work.run.run(process_class, *args, **inputs)[source]

Synchronously (i.e. blocking) run a workfunction or process.

Parameters:
  • process_class – The process class or workfunction
  • _attributes – Optional attributes (only for process)
  • args – Positional arguments for a workfunction
  • inputs – The keyword inputs to the process or workfunction
aiida.work.run.submit(process_class, _jobs_store=None, **kwargs)[source]
class aiida.work.test_utils.BadOutput[source]

Bases: aiida.work.process.Process

A Process that emits an output that isn’t part of the spec, thereby raising an exception.

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.test_utils'
_run()[source]
classmethod define(spec)[source]
class aiida.work.test_utils.DummyProcess[source]

Bases: aiida.work.process.Process

A Process that does nothing when it runs.

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.test_utils'
_run(**kwargs)[source]
classmethod define(spec)[source]
class aiida.work.test_utils.ExceptionProcess[source]

Bases: aiida.work.process.Process

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.test_utils'
_run()[source]
class aiida.work.util.ProcessStack[source]

Bases: object

Keep track of the per-thread call stack of processes.

__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.util'
__weakref__

list of weak references to the object (if defined)

_thread_local = <thread._local object>
classmethod get_active_process_calc_node()[source]

Get the calculation node of the process at the top of the stack

Returns:The calculation node
Return type:aiida.orm.implementation.general.calculation.job.AbstractJobCalculation
classmethod get_active_process_id()[source]

Get the pid of the process at the top of the stack

Returns:The pid
classmethod pids()[source]
classmethod pop(process=None, pid=None)[source]

Pop a process from the stack. To make sure the stack is not corrupted, the process instance or pid of the calling process should be supplied so we can verify that it really is at the top of the stack.

Parameters:
  • process – The process instance
  • pid – The process id.
classmethod push(process)[source]
classmethod stack()[source]
classmethod top()[source]
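A per-thread stack like this is typically built on threading.local. ThreadLocalStack below is a hypothetical sketch of the documented classmethods, not the real implementation:

```python
import threading

class ThreadLocalStack:
    """Per-thread call stack of processes (simplified sketch)."""
    _local = threading.local()

    @classmethod
    def stack(cls):
        # Each thread lazily gets its own list
        if not hasattr(cls._local, 'stack'):
            cls._local.stack = []
        return cls._local.stack

    @classmethod
    def push(cls, process):
        cls.stack().append(process)

    @classmethod
    def pop(cls, process=None):
        # Verify the caller really is at the top before popping
        if process is not None and cls.stack()[-1] is not process:
            raise RuntimeError('process is not the top of the stack')
        return cls.stack().pop()

    @classmethod
    def top(cls):
        return cls.stack()[-1]
```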
aiida.work.util.get_or_create_output_group(calculation)[source]

For a given Calculation, get or create a new frozendict Data node that has as its values all output Data nodes of the Calculation.

Parameters:calculation – Calculation
aiida.work.util.is_workfunction(func)[source]
aiida.work.util.load_class(classstring)[source]

Load a class from a string

class aiida.work.workchain.Stepper(workflow)[source]

Bases: object

__abstractmethods__ = frozenset(['step', 'save_position', 'load_position'])
__init__(workflow)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.work.workchain'
__weakref__

list of weak references to the object (if defined)

load_position(bundle)[source]
save_position(out_position)[source]
step()[source]

Execute one step of the instructions.

Returns:A 2-tuple with entries:
  • 0: True if the stepper has finished, False otherwise
  • 1: The return value from the executed step
Return type:tuple
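The step() contract can be sketched with a stepper over a list of callables; ListStepper is a hypothetical illustration of the documented 2-tuple return, not one of the real concrete steppers:

```python
class ListStepper:
    """Steps through a list of callables, one per step() call."""
    def __init__(self, instructions):
        self._instructions = list(instructions)
        self._pos = 0

    def step(self):
        # Run the next instruction and report whether we are done
        result = self._instructions[self._pos]()
        self._pos += 1
        finished = self._pos == len(self._instructions)
        return finished, result
```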
aiida.work.workchain.ToContext(**kwargs)[source]

Utility function that returns a list of UpdateContext Interstep instances

NOTE: This is effectively a copy of the WorkChain.to_context method, added to keep backwards compatibility; it should eventually be deprecated

class aiida.work.workchain.WorkChain[source]

Bases: aiida.work.process.Process

A WorkChain, the base class for AiiDA workflows.

class Context(value=None)[source]

Bases: object

__delattr__(item)[source]

x.__delattr__(‘name’) <==> del x.name

__delitem__(key)[source]
__dir__()[source]
__getattr__(name)[source]
__getitem__(item)[source]
__init__(value=None)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__iter__()[source]
__module__ = 'aiida.work.workchain'
__setattr__(name, value)[source]

x.__setattr__(‘name’, value) <==> x.name = value

__setitem__(key, value)[source]
__weakref__

list of weak references to the object (if defined)

_get_dict()[source]
get(key, default=None)[source]
save_instance_state(out_state)[source]
setdefault(key, default=None)[source]
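The Context class exposes the same data through both attribute and item access. A hypothetical, simplified sketch of that dual-access behaviour (omitting save_instance_state and iteration):

```python
class Context:
    """Dual attribute/item access over one backing dict (simplified sketch)."""
    def __init__(self, value=None):
        # Bypass our own __setattr__ so _data lives in the instance dict
        object.__setattr__(self, '_data', dict(value or {}))

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails
        try:
            return object.__getattribute__(self, '_data')[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self._data[name] = value

    def __delattr__(self, item):
        del self._data[item]

    def __getitem__(self, item):
        return self._data[item]

    def __setitem__(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)

    def setdefault(self, key, default=None):
        return self._data.setdefault(key, default)
```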
_ABORTED = 'aborted'
_BARRIERS = 'barriers'
_CONTEXT = 'context'
_INTERSTEPS = 'intersteps'
_STEPPER_STATE = 'stepper_state'
__abstractmethods__ = frozenset([])
__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
_aborted
_do_abort
_do_step(wait_on=None)[source]
_handle_do_abort()[source]

Check whether a request to abort has been registered, by checking whether the DO_ABORT_KEY attribute has been set, and if so call self.abort and remove the DO_ABORT_KEY attribute

_run(**kwargs)[source]
_spec_type

alias of _WorkChainSpec

abort(msg=None, timeout=None)[source]

Abort the workchain by calling the abort method of the Process and also adding the abort message to the report

Parameters:
  • msg (str) – The abort message
  • timeout (float) – Wait for the given time until the process has aborted
Returns:

True if the process is aborted at the end of the function, False otherwise

abort_nowait(msg=None)[source]

Abort the workchain at the next state transition without waiting, which is achieved by passing a timeout value of zero

Parameters:msg (str) – The abort message
ctx
classmethod define(spec)[source]
insert_barrier(wait_on)[source]

Insert a barrier that will cause the workchain to wait until the wait on is finished before continuing to the next step.

Parameters:wait_on – The thing to wait on (of type plum.wait.wait_on)
insert_intersteps(intersteps)[source]

Insert an interstep to be executed after the current step ends but before the next step begins

Parameters:interstep – The Interstep instance to insert
on_create(pid, inputs, saved_state)[source]

Called when the process is created. If a checkpoint is supplied, the process should reinstate its state at the time the checkpoint was taken; if the checkpoint has a wait_on, the process will continue from the corresponding callback function.

Parameters:inputs – The inputs the process should run using.
remove_barrier(wait_on)[source]

Remove a barrier.

Precondition: must be a barrier that was previously inserted

Parameters:wait_on – The wait on to remove (of type plum.wait.wait_on)
save_instance_state(out_state)[source]
to_context(**kwargs)[source]

This is a convenience method that provides syntactic sugar for a user to add multiple intersteps, each of which will assign a certain value to the corresponding key in the context of the workchain
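The effect of to_context can be emulated with a toy resolver: each keyword argument names a context key and a pending result, and before the next step runs the engine binds the resolved value to that key. The resolve function and pid values below are hypothetical stand-ins for the real machinery.

```python
def toy_to_context(ctx, resolve, **kwargs):
    """Toy version of the to_context mechanism: resolve each pending
    value and bind it to the corresponding key in the context dict."""
    for key, pending in kwargs.items():
        ctx[key] = resolve(pending)


# Hypothetical finished-calculation registry, keyed by pid
finished_calcs = {42: {'energy': -13.6}}

ctx = {}
# similar in spirit to calling self.to_context(relax=<pending result>)
toy_to_context(ctx, finished_calcs.get, relax=42)
print(ctx['relax'])  # {'energy': -13.6}
```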

class aiida.work.workchain._Block(commands)[source]

Bases: aiida.work.workchain._Instruction

Represents a block of instructions, i.e. a sequential list of instructions.

class Stepper(workflow, commands)[source]

Bases: aiida.work.workchain.Stepper

_POSITION = 'pos'
_STEPPER_POS = 'stepper_pos'
__abstractmethods__ = frozenset([])
__init__(workflow, commands)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
load_position(bundle)[source]
save_position(out_position)[source]
step()[source]

Execute one step of the instructions.

Returns:A 2-tuple with entries: 0. True if the stepper has finished, False otherwise; 1. The return value from the executed step
Return type:tuple
__abstractmethods__ = frozenset([])
__init__(commands)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
create_stepper(workflow)[source]
get_description(indent_level=0, indent_increment=4)[source]

Get a text description of these instructions.

Returns:The description
Return type:str

class aiida.work.workchain._Conditional(parent, condition)[source]

Bases: object

Object that represents a condition with a corresponding body to be executed if the condition is true.

E.g.

if(condition):
  body

or:

while(condition):
  body
__call__(...) <==> x(...)[source]
__dict__ = dict_proxy({'body': <property object>, 'is_true': <function is_true>, '__module__': 'aiida.work.workchain', 'condition': <property object>, '__dict__': <attribute '__dict__' of '_Conditional' objects>, '__call__': <function __call__>, '__weakref__': <attribute '__weakref__' of '_Conditional' objects>, '__doc__': '\n Object that represents some condition with the corresponding body to be\n executed if the condition.\n \n E.g. ::\n\n if(condition):\n body\n\n or::\n\n while(condition):\n body\n ', '__init__': <function __init__>})
__init__(parent, condition)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
__weakref__

list of weak references to the object (if defined)

body
condition
is_true(workflow)[source]
class aiida.work.workchain._If(condition)[source]

Bases: aiida.work.workchain._Instruction

class Stepper(workflow, if_spec)[source]

Bases: aiida.work.workchain.Stepper

_POSITION = 'pos'
_STEPPER_POS = 'stepper_pos'
__abstractmethods__ = frozenset([])
__init__(workflow, if_spec)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
_create_stepper()[source]
load_position(bundle)[source]
save_position(out_position)[source]
step()[source]

Execute one step of the instructions.

Returns:A 2-tuple with entries: 0. True if the stepper has finished, False otherwise; 1. The return value from the executed step
Return type:tuple
__abstractmethods__ = frozenset([])
__call__(*commands)[source]

This is how the commands for the if_(...) body are set.

Parameters:commands – The commands to run if the condition is true
Returns:This instance

__init__(condition)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
conditionals
create_stepper(workflow)[source]
elif_(condition)[source]
else_(*commands)[source]
get_description()[source]

Get a text description of these instructions.

Returns:The description
Return type:str

class aiida.work.workchain._Instruction[source]

Bases: object

This class represents an instruction in a workchain. To step through the instruction you need to get a stepper by calling create_stepper(), from which you can call the step() method.

__abstractmethods__ = frozenset(['get_description', 'create_stepper'])
__dict__ = dict_proxy({'__module__': 'aiida.work.workchain', 'check_command': <staticmethod object>, '__metaclass__': <class 'abc.ABCMeta'>, '_abc_negative_cache': <_weakrefset.WeakSet object>, '__str__': <function __str__>, 'get_description': <function get_description>, '__dict__': <attribute '__dict__' of '_Instruction' objects>, '__weakref__': <attribute '__weakref__' of '_Instruction' objects>, '_abc_cache': <_weakrefset.WeakSet object>, 'create_stepper': <function create_stepper>, '__abstractmethods__': frozenset(['get_description', 'create_stepper']), '_abc_negative_cache_version': 94, '_abc_registry': <_weakrefset.WeakSet object>, '__doc__': '\n This class represents an instruction in a a workchain. To step through the\n step you need to get a stepper by calling ``create_stepper()`` from which\n you can call the :class:`~Stepper.step()` method.\n '})
__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.work.workchain'
__str__() <==> str(x)[source]
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
static check_command()[source]
create_stepper(workflow)[source]
get_description()[source]

Get a text description of these instructions.

Returns:The description
Return type:str

class aiida.work.workchain._InterstepFactory[source]

Bases: object

Factory to create the appropriate Interstep instance based on the class string that was written to the bundle

__dict__ = dict_proxy({'__dict__': <attribute '__dict__' of '_InterstepFactory' objects>, '__module__': 'aiida.work.workchain', '__weakref__': <attribute '__weakref__' of '_InterstepFactory' objects>, 'create': <function create>, '__doc__': '\n Factory to create the appropriate Interstep instance based\n on the class string that was written to the bundle\n '})
__module__ = 'aiida.work.workchain'
__weakref__

list of weak references to the object (if defined)

create(bundle)[source]
exception aiida.work.workchain._PropagateReturn[source]

Bases: exceptions.BaseException

__module__ = 'aiida.work.workchain'
__weakref__

list of weak references to the object (if defined)

class aiida.work.workchain._Return[source]

Bases: aiida.work.workchain._Instruction

A return instruction to tell the workchain to stop stepping through the outline and cease execution immediately.

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
create_stepper(workflow)[source]
get_description()[source]

Get a text description of these instructions.

Returns:The description
Return type:str

class aiida.work.workchain._ReturnStepper(workflow)[source]

Bases: aiida.work.workchain.Stepper

__abstractmethods__ = frozenset([])
__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
load_position(bundle)[source]

Nothing to be done: the stepper has no internal state.

save_position(out_position)[source]
step()[source]

Execute one step of the instructions.

Returns:A 2-tuple with entries: 0. True if the stepper has finished, False otherwise; 1. The return value from the executed step
Return type:tuple
class aiida.work.workchain._While(condition)[source]

Bases: aiida.work.workchain._Conditional, aiida.work.workchain._Instruction

class Stepper(workflow, while_spec)[source]

Bases: aiida.work.workchain.Stepper

_CHECK_CONDITION = 'check_condition'
_FINISHED = 'finished'
_STEPPER_POS = 'stepper_pos'
__abstractmethods__ = frozenset([])
__init__(workflow, while_spec)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
_body_stepper
load_position(bundle)[source]
save_position(out_position)[source]
step()[source]

Execute one step of the instructions.

Returns:A 2-tuple with entries: 0. True if the stepper has finished, False otherwise; 1. The return value from the executed step
Return type:tuple
__abstractmethods__ = frozenset([])
__init__(condition)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 94
_abc_registry = <_weakrefset.WeakSet object>
create_stepper(workflow)[source]
get_description()[source]

Get a text description of these instructions.

Returns:The description
Return type:str

class aiida.work.workchain._WorkChainSpec[source]

Bases: aiida.work.process.ProcessSpec

__init__()[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.work.workchain'
get_description()[source]

Get a text description of this process specification.

Returns:A text description
Return type:str
get_outline()[source]
outline(*commands)[source]

Define the outline that describes this work chain.

Parameters:commands – One or more functions that make up this work chain.
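The outline(*commands) call above simply records the sequence of steps that make up the work chain. A runnable sketch, with toy classes standing in for ProcessSpec and WorkChain (the step names are purely illustrative):

```python
class ToySpec(object):
    """Emulates the outline() call documented above: it records the
    sequence of commands that make up the work chain."""

    def __init__(self):
        self._outline = ()

    def outline(self, *commands):
        self._outline = commands

    def get_outline(self):
        return self._outline


class ToyChain(object):
    @classmethod
    def define(cls, spec):
        # a define() implementation typically wires up the outline here
        spec.outline(cls.setup, cls.compute, cls.finalize)

    def setup(self): pass
    def compute(self): pass
    def finalize(self): pass


spec = ToySpec()
ToyChain.define(spec)
print([c.__name__ for c in spec.get_outline()])  # ['setup', 'compute', 'finalize']
```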
aiida.work.workchain.if_(condition)[source]

A conditional that can be used in a workchain outline.

Use as:

if_(cls.conditional)(
  cls.step1,
  cls.step2
)

Each step can, of course, also be any other valid workchain instruction, e.g. a nested conditional.

Parameters:condition – The workchain method that will return True or False
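The if_/elif_/else_ chaining documented on the _If class can be mimicked with a small eager evaluator; the real construct builds steppers lazily, so treat this purely as an illustration of how the first true condition selects its body.

```python
class ToyIf(object):
    """Sketch of if_/elif_/else_ branch selection: each conditional
    pairs a predicate with a body; the first true predicate wins."""

    def __init__(self, condition):
        self._branches = []       # list of (condition, commands) pairs
        self._pending = condition

    def __call__(self, *commands):
        # attach the body to the most recently given condition
        self._branches.append((self._pending, commands))
        return self

    def elif_(self, condition):
        self._pending = condition
        return self

    def else_(self, *commands):
        # an always-true fallback branch
        self._branches.append((lambda wf: True, commands))
        return self

    def run(self, workflow):
        for condition, commands in self._branches:
            if condition(workflow):
                return [cmd(workflow) for cmd in commands]
        return []


branch = ToyIf(lambda wf: wf < 0)(
    lambda wf: 'negative',
).elif_(lambda wf: wf == 0)(
    lambda wf: 'zero',
).else_(
    lambda wf: 'positive',
)
print(branch.run(5))   # ['positive']
print(branch.run(-1))  # ['negative']
```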
aiida.work.workchain.while_(condition)[source]

A while loop that can be used in a workchain outline.

Use as:

while_(cls.conditional)(
  cls.step1,
  cls.step2
)

Each step can, of course, also be any other valid workchain instruction, e.g. a nested conditional.

Parameters:condition – The workchain method that will return True or False
aiida.work.workfunction.workfunction(func)[source]

A decorator to turn a standard python function into a workfunction. Example usage:

>>> from aiida.orm.data.base import Int
>>> from aiida.work.workfunction import workfunction as wf
>>>
>>> # Define the workfunction
>>> @wf
... def sum(a, b):
...     return a + b
>>> # Run it with some input
>>> r = sum(Int(4), Int(5))
>>> print(r)
9
>>> r.get_inputs_dict() 
{u'_return': <WorkCalculation: uuid: ce0c63b3-1c84-4bb8-ba64-7b70a36adf34 (pk: 3567)>}
>>> r.get_inputs_dict()['_return'].get_inputs()
[4, 5]