aiida.storage.sqlite_temp package#
A temporary backend, using an in-memory sqlite database.
This backend is intended for testing and demonstration purposes. Whenever it is instantiated, it creates a fresh storage backend, and destroys it when it is garbage collected.
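The "fresh storage per instantiation" behaviour mirrors how an in-memory SQLite database works: the database exists only for the lifetime of its connection. A minimal stdlib sketch of that idea (the `node` table here is illustrative, not AiiDA's actual schema):

```python
import sqlite3

# An in-memory database is created fresh for this connection and
# disappears when the connection is closed or garbage collected.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE node (id INTEGER PRIMARY KEY, label TEXT)')
conn.execute("INSERT INTO node (label) VALUES ('calc')")
rows = conn.execute('SELECT id, label FROM node').fetchall()
print(rows)  # [(1, 'calc')]
conn.close()  # all data is gone at this point
```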
Submodules#
Definition of the SqliteTempBackend backend.
- class aiida.storage.sqlite_temp.backend.SandboxShaRepositoryBackend(filepath: str | None = None)[source]#
Bases:
SandboxRepositoryBackend
A sandbox repository backend that uses the sha256 of the file as the key.
This allows for compatibility with the archive format (i.e. SqliteZipBackend), so that temporary profiles can be exported and imported.
- __abstractmethods__ = frozenset({})#
- __module__ = 'aiida.storage.sqlite_temp.backend'#
- _put_object_from_filelike(handle: BinaryIO) str [source]#
Store the byte contents of a file in the repository.
- Parameters:
handle – filelike object with the byte content to be stored.
- Returns:
the generated fully qualified identifier for the object within the repository.
- Raises:
TypeError – if the handle is not a byte stream.
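The content-addressed scheme described above (the SHA-256 of the content is the key) can be sketched with the stdlib alone. This dict-backed toy store is an illustration, not the actual sandbox implementation:

```python
import hashlib
import io
from typing import BinaryIO

def put_object_from_filelike(handle: BinaryIO, store: dict) -> str:
    """Store byte content under its SHA-256 hex digest and return the key."""
    content = handle.read()
    if not isinstance(content, bytes):
        raise TypeError('handle must be a byte stream')
    key = hashlib.sha256(content).hexdigest()
    store[key] = content  # identical content always maps to the same key
    return key

store: dict = {}
key = put_object_from_filelike(io.BytesIO(b'hello'), store)
print(key)  # 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

Because the key is derived from the content, storing the same bytes twice is a no-op, which is what makes export/import between backends with the same key format cheap.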
- _sandbox: SandboxFolder | None#
- get_info(detailed: bool = False, **kwargs) dict [source]#
Returns relevant information about the content of the repository.
- Parameters:
detailed – flag to enable extra information (detailed=False by default, only returns basic information).
- Returns:
a dictionary with the information.
- get_object_hash(key: str) str [source]#
Return the SHA-256 hash of an object stored under the given key.
Important
A SHA-256 hash should always be returned, to ensure consistency across different repository implementations.
- Parameters:
key – fully qualified identifier for the object within the repository.
- Raises:
FileNotFoundError – if the file does not exist.
OSError – if the file could not be opened.
- property key_format: str | None#
Return the format for the keys of the repository.
This is important when migrating between backends (e.g. archive -> main): if the key formats are not equal, all Node.base.repository.metadata must be re-computed before importing, otherwise the keys will not match those in the repository.
- maintain(dry_run: bool = False, live: bool = True, **kwargs) None [source]#
Performs maintenance operations.
- Parameters:
dry_run – flag to only print the actions that would be taken without actually executing them.
live – flag to indicate to the backend whether AiiDA is live or not (i.e. if the profile of the backend is currently being used/accessed). The backend is expected then to only allow (and thus set by default) the operations that are safe to perform in this state.
- class aiida.storage.sqlite_temp.backend.SqliteTempBackend(profile: Profile)[source]#
Bases:
StorageBackend
A temporary backend, using an in-memory sqlite database.
This backend is intended for testing and demonstration purposes. Whenever it is instantiated, it creates a fresh storage backend, and destroys it when it is garbage collected.
- class Model(**data: Any)[source]#
Bases:
BaseModel
- __abstractmethods__ = frozenset({})#
- __dict__#
- __module__ = 'aiida.storage.sqlite_temp.backend'#
- __private_attributes__: ClassVar[Dict[str, ModelPrivateAttr]] = {}#
Metadata about the private attributes of the model.
- __pydantic_complete__: ClassVar[bool] = True#
Whether model building is completed, or if there are still undefined fields.
- __pydantic_decorators__: ClassVar[_decorators.DecoratorInfos] = DecoratorInfos(validators={}, field_validators={}, root_validators={}, field_serializers={}, model_serializers={}, model_validators={}, computed_fields={})#
Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.
- __pydantic_extra__: dict[str, Any] | None#
A dictionary containing extra values, if [extra][pydantic.config.ConfigDict.extra] is set to ‘allow’.
- __pydantic_generic_metadata__: ClassVar[_generics.PydanticGenericMetadata] = {'args': (), 'origin': None, 'parameters': ()}#
Metadata for generic models; contains data used for a similar purpose to __args__, __origin__, __parameters__ in typing-module generics. May eventually be replaced by these.
- __pydantic_post_init__: ClassVar[None | Literal['model_post_init']] = None#
The name of the post-init method for the model, if defined.
- __pydantic_private__: dict[str, Any] | None#
Values of private attributes set on the model instance.
- __weakref__#
list of weak references to the object
- model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}#
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
- model_config: ClassVar[ConfigDict] = {'defer_build': True}#
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_fields: ClassVar[Dict[str, FieldInfo]] = {'filepath': FieldInfo(annotation=str, required=False, default_factory=mkdtemp, title='Temporary directory', description='Temporary directory in which to store data for this backend.')}#
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.__fields__ from Pydantic V1.
- __abstractmethods__ = frozenset({})#
- __init__(profile: Profile)[source]#
Initialize the backend, for this profile.
- Raises:
aiida.common.exceptions.UnreachableStorage – if the storage cannot be accessed.
aiida.common.exceptions.IncompatibleStorageSchema – if the profile's storage schema is not at the latest version (and thus should be migrated).
aiida.common.exceptions.CorruptStorage – if the storage is internally inconsistent.
- __module__ = 'aiida.storage.sqlite_temp.backend'#
- _clear() None [source]#
Clear the storage, removing all data.
Warning
This is a destructive operation, and should only be used for testing purposes.
- static _get_mapper_from_entity(entity_type: EntityTypes, with_pk: bool)[source]#
Return the Sqlalchemy mapper and fields corresponding to the given entity.
- Parameters:
with_pk – if True, the fields returned will include the primary key
- property authinfos#
Return the collection of authorisation information objects
- bulk_insert(entity_type: EntityTypes, rows: list[dict], allow_defaults: bool = False) list[int] [source]#
Insert a list of entities into the database, directly into a backend transaction.
- Parameters:
entity_type – The type of the entity
rows – A list of dictionaries, containing all fields of the backend model, except the id field (a.k.a. primary key), which will be generated dynamically
allow_defaults – If False, assert that each row contains all fields (except primary key(s)); otherwise, allow default values for missing fields.
- Raises:
IntegrityError – if the keys in a row are not a subset of the columns in the table
- Returns:
The list of generated primary keys for the entities
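The contract above (all rows carry the same fields unless defaults are allowed; the database generates the primary keys) can be sketched against a plain sqlite3 table. Table and column names are hypothetical, and this sketch raises ValueError rather than the backend's IntegrityError:

```python
import sqlite3

def bulk_insert(conn, table, rows, allow_defaults=False):
    """Insert rows and return the generated primary keys."""
    columns = sorted(rows[0])
    if not allow_defaults:
        for row in rows:
            if sorted(row) != columns:
                raise ValueError('each row must contain exactly the same fields')
    placeholders = ', '.join('?' for _ in columns)
    pks = []
    for row in rows:
        cursor = conn.execute(
            f'INSERT INTO {table} ({", ".join(columns)}) VALUES ({placeholders})',
            [row[c] for c in columns],
        )
        pks.append(cursor.lastrowid)  # primary key generated by the database
    return pks

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE node (id INTEGER PRIMARY KEY, label TEXT)')
pks = bulk_insert(conn, 'node', [{'label': 'a'}, {'label': 'b'}])
print(pks)  # [1, 2]
```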
- bulk_update(entity_type: EntityTypes, rows: list[dict]) None [source]#
Update a list of entities in the database, directly with a backend transaction.
- Parameters:
entity_type – The type of the entity
rows – A list of dictionaries, containing fields of the backend model to update, and the id field (a.k.a. primary key)
- Raises:
IntegrityError – if the keys in a row are not a subset of the columns in the table
- cli_exposed = False#
Ensure this plugin is not exposed in verdi profile setup.
- property comments#
Return the collection of comments
- property computers#
Return the collection of computers
- static create_profile(name: str = 'temp', default_user_email='user@email.com', options: dict | None = None, debug: bool = False, filepath: str | Path | None = None) Profile [source]#
Create a new profile instance for this backend.
- delete_nodes_and_connections(pks_to_delete: Sequence[int])[source]#
Delete all nodes corresponding to pks in the input and any links to/from them.
This method is intended to be used within a transaction context.
- Parameters:
pks_to_delete – a sequence of node pks to delete
- Raises:
AssertionError – if a transaction is not active
- get_backend_entity(model) BackendEntity [source]#
Return the backend entity that corresponds to the given Model instance.
- get_global_variable(key: str)[source]#
Return a global variable from the storage.
- Parameters:
key – the key of the setting
- Raises:
KeyError – if the setting does not exist
- get_info(detailed: bool = False) dict [source]#
Return general information on the storage.
- Parameters:
detailed – flag to request more detailed information about the content of the storage.
- Returns:
a nested dict with the relevant information.
- get_repository() SandboxShaRepositoryBackend [source]#
Return the object repository configured for this backend.
- property groups#
Return the collection of groups
- classmethod initialise(profile: Profile, reset: bool = False) bool [source]#
Initialise the storage backend.
This is typically used once when a new storage backed is created. If this method returns without exceptions the storage backend is ready for use. If the backend already seems initialised, this method is a no-op.
- Parameters:
reset – If True, destroy the backend if it already exists, including all of its data, before recreating and initialising it. This is useful, for example, for test profiles that need to be reset before or after tests have run.
- Returns:
True if the storage was initialised by the function call, False if it was already initialised.
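The initialise semantics (idempotent; returns True only when this call actually created the storage) can be sketched with a hypothetical single-table schema:

```python
import sqlite3

def initialise(conn, reset=False):
    """Create the schema if needed; return True only if this call created it."""
    if reset:
        conn.execute('DROP TABLE IF EXISTS db_node')
    exists = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name='db_node'"
    ).fetchone() is not None
    if exists:
        return False  # already initialised: this call is a no-op
    conn.execute('CREATE TABLE db_node (id INTEGER PRIMARY KEY)')
    return True

conn = sqlite3.connect(':memory:')
print(initialise(conn))  # True: first call creates the schema
print(initialise(conn))  # False: already initialised
```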
- property logs#
Return the collection of logs
- maintain(dry_run: bool = False, live: bool = True, **kwargs) None [source]#
Perform maintenance tasks on the storage.
If live is False, then this method may attempt to block the profile associated with the storage to guarantee the safety of its procedures. This will not only prevent any other subsequent process from accessing that profile, but will also first check if there is already any process using it and raise if that is the case. The user will have to manually stop any processes that are currently accessing the profile, or wait for them to finish on their own.
- Parameters:
dry_run – flag to only print the actions that would be taken without actually executing them.
live – flag to indicate to the backend whether AiiDA is live or not (i.e. if the profile of the backend is currently being used/accessed).
- classmethod migrate(profile: Profile)[source]#
Migrate the storage of a profile to the latest schema version.
If the schema version is already the latest version, this method does nothing. If the storage is uninitialised, this method will raise an exception.
- Raises:
aiida.common.exceptions.UnreachableStorage – if the storage cannot be accessed.
StorageMigrationError – if the storage is not initialised.
- property nodes#
Return the collection of nodes
- query() SqliteQueryBuilder [source]#
Return an instance of a query builder implementation for this backend
- set_global_variable(key: str, value, description: str | None = None, overwrite=True) None [source]#
Set a global variable in the storage.
- Parameters:
key – the key of the setting
value – the value of the setting
description – the description of the setting (optional)
overwrite – if True, overwrite the setting if it already exists
- Raises:
ValueError – if the key already exists and overwrite is False
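The overwrite guard can be sketched with a plain dict standing in for the settings table (illustrative only, not the actual storage model):

```python
def set_global_variable(settings, key, value, description=None, overwrite=True):
    """Set a setting, refusing to clobber an existing key unless overwrite=True."""
    if not overwrite and key in settings:
        raise ValueError(f'key {key!r} already exists and overwrite is False')
    settings[key] = {'value': value, 'description': description}

settings = {}
set_global_variable(settings, 'schema_version', 1)
try:
    set_global_variable(settings, 'schema_version', 2, overwrite=False)
except ValueError:
    print('refused to overwrite')  # existing key, overwrite=False
set_global_variable(settings, 'schema_version', 2)  # overwrite=True by default
print(settings['schema_version']['value'])  # 2
```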
- transaction() Iterator[Session] [source]#
Open a transaction to be used as a context manager.
If there is an exception within the context then the changes will be rolled back and the state will be as before entering. Transactions can be nested.
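The rollback-on-exception behaviour of such a context manager can be sketched with sqlite3; this simplified version omits the nesting support the real backend provides:

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def transaction(conn):
    """Commit on clean exit; roll back and re-raise on any exception."""
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE node (id INTEGER PRIMARY KEY, label TEXT)')
conn.commit()
try:
    with transaction(conn):
        conn.execute("INSERT INTO node (label) VALUES ('a')")
        raise RuntimeError('abort')  # triggers the rollback
except RuntimeError:
    pass
count = conn.execute('SELECT COUNT(*) FROM node').fetchone()[0]
print(count)  # 0: the insert was rolled back
```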
- property users#
Return the collection of users