Multiscale
Dataset #
Bases: FrozenBase
A single entry in the `multiscales.datasets` list.
See https://ngff.openmicroscopy.org/0.4/#multiscale-md for the specification of this data structure.
Attributes:

Name | Type | Description |
---|---|---|
`path` | `str` | The path to the Zarr array that stores the image described by this metadata. This path should be relative to the group that contains this metadata. |
`coordinateTransformations` | `ScaleTransform \| TranslationTransform` | The coordinate transformations for this image. |
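For illustration, a minimal sketch of constructing a `Dataset` by hand. The transform class names `VectorScale` and `VectorTranslation` and both import paths are assumptions about this library's layout; the documented `create_dataset` helper (later on this page) covers the common case.

```python
# Sketch: a Dataset entry for the array stored at "s0".
# VectorScale / VectorTranslation and the import paths are assumed names;
# check pydantic_ome_ngff.v04.transform for the actual classes.
from pydantic_ome_ngff.v04.multiscale import Dataset
from pydantic_ome_ngff.v04.transform import VectorScale, VectorTranslation

dataset = Dataset(
    path="s0",
    coordinateTransformations=(
        VectorScale(scale=(1.0, 1.0)),
        VectorTranslation(translation=(0.0, 0.0)),
    ),
)
```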
MultiscaleMetadata #
Bases: VersionedBase, FrozenBase, SkipNoneBase
Multiscale image metadata.
See https://ngff.openmicroscopy.org/0.4/#multiscale-md for the specification of this data structure.
Attributes:

Name | Type | Description |
---|---|---|
`name` | `Any`, default = `None` | The name for this multiscale image. Optional. Defaults to `None`. |
`type` | `Any`, default = `None` | The type of the multiscale image. Optional. Defaults to `None`. |
`metadata` | `Dict[str, Any] \| None`, default = `None` | Metadata for this multiscale image. Optional. Defaults to `None`. |
`datasets` | `tuple[Dataset, ...]` | A collection of descriptions of arrays that collectively comprise this multiscale image. |
`axes` | `tuple[Axis, ...]` | A tuple of `Axis` objects describing the dimensions of the image. |
`coordinateTransformations` | `tuple[tx.Scale] \| tuple[tx.Scale, tx.Translation] \| None`, default = `None` | Coordinate transformations that express a scaling and translation shared by all elements of `datasets`. Optional. Defaults to `None`. |
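A minimal sketch of assembling `MultiscaleMetadata` for a two-level 2D pyramid, using the `create_dataset` helper documented later on this page. The `Axis` import path is an assumption.

```python
from pydantic_ome_ngff.v04.axis import Axis  # assumed import path
from pydantic_ome_ngff.v04.multiscale import MultiscaleMetadata, create_dataset

axes = (
    Axis(name="y", type="space", unit="micrometer"),
    Axis(name="x", type="space", unit="micrometer"),
)
datasets = (
    create_dataset(path="s0", scale=(1.0, 1.0), translation=(0.0, 0.0)),
    create_dataset(path="s1", scale=(2.0, 2.0), translation=(0.5, 0.5)),
)
meta = MultiscaleMetadata(name="example", axes=axes, datasets=datasets)
```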
validate_transforms #
Ensure that the dimensionality of the top-level coordinateTransformations, if present, is consistent with the rest of the model.
Source code in src/pydantic_ome_ngff/v04/multiscale.py
MultiscaleGroupAttrs #
Bases: BaseModel
A model of the required attributes of a Zarr group that implements OME-NGFF Multiscales metadata.
See https://ngff.openmicroscopy.org/0.4/#multiscale-md for the specification of this data structure.
Attributes:

Name | Type | Description |
---|---|---|
`multiscales` | `tuple[MultiscaleMetadata]` | A list of `MultiscaleMetadata` objects. |
MultiscaleGroup #
Bases: GroupSpec[MultiscaleGroupAttrs, ArraySpec | GroupSpec]
A model of a Zarr group that implements OME-NGFF Multiscales metadata.
See https://ngff.openmicroscopy.org/0.4/#multiscale-md for the specification of this data structure.
Attributes:

Name | Type | Description |
---|---|---|
`attributes` | `GroupAttrs` | The attributes of this Zarr group, which should contain valid multiscales metadata. |
`members` | `Dict[str, ArraySpec \| GroupSpec]` | The members of this Zarr group. Should be instances of `ArraySpec` or `GroupSpec`. |
from_zarr classmethod #
Create an instance of `Group` from `node`, a `zarr.Group`. This method discovers Zarr arrays in the hierarchy rooted at `node` by inspecting the OME-NGFF multiscales metadata.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`node` | `Group` | A Zarr group that has valid OME-NGFF multiscale metadata. | required |
Returns:

Type | Description |
---|---|
`Group` | A model of the Zarr group. |
Source code in src/pydantic_ome_ngff/v04/multiscale.py
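A short usage sketch; the store path below is a placeholder for a group containing OME-NGFF multiscales metadata.

```python
import zarr

from pydantic_ome_ngff.v04.multiscale import MultiscaleGroup

# "example.zarr/image" is a placeholder path.
node = zarr.open_group("example.zarr/image", mode="r")
model = MultiscaleGroup.from_zarr(node)
print(model.attributes.multiscales[0].datasets)
```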
from_arrays classmethod #
from_arrays(arrays, *, paths, axes, scales, translations, name=None, type=None, metadata=None, chunks='auto', compressor=DEFAULT_COMPRESSOR, fill_value=0, order='auto')
Create a `MultiscaleGroup` from a sequence of multiscale arrays and spatial metadata. The arrays are used as templates for corresponding `ArraySpec` instances, which model the Zarr arrays that would be created if the `MultiscaleGroup` was stored.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`paths` | `Sequence[str]` | The paths to the arrays. | required |
`axes` | `Sequence[Axis]` | An `Axis` object for each dimension of the arrays. | required |
`arrays` | `Sequence[ArrayLike \| ChunkedArrayLike]` | A sequence of array-like objects that collectively represent the same image at multiple levels of detail. The attributes of these arrays are used to create `ArraySpec` instances. | required |
`scales` | `Sequence[tuple[int \| float, ...]]` | A scale value for each axis of the array, for each array in `arrays`. | required |
`translations` | `Sequence[tuple[int \| float, ...]]` | A translation value for each axis of the array, for each array in `arrays`. | required |
`name` | `str \| None` | A name for the multiscale collection. Optional. | `None` |
`type` | `str \| None` | A description of the type of multiscale image represented by this group. Optional. | `None` |
`metadata` | `dict[str, Any] \| None` | Arbitrary metadata associated with this multiscale collection. Optional. | `None` |
`chunks` | `tuple[int, ...] \| tuple[tuple[int, ...], ...] \| Literal['auto']` | The chunks for the arrays in this multiscale group. If the string "auto" is provided, each array will have chunks set to the zarr-python default value, which depends on the shape and dtype of the array. If a single sequence of ints is provided, then this defines the chunks for all arrays. If a sequence of sequences of ints is provided, then this defines the chunks for each array. | `'auto'` |
`fill_value` | `Any` | The fill value for the Zarr arrays. | `0` |
`compressor` | `Codec \| Literal['auto']` | The compressor to use for the arrays. Defaults to `DEFAULT_COMPRESSOR`. | `DEFAULT_COMPRESSOR` |
`order` | `Literal['C', 'F', 'auto']` | The memory layout used for chunks of Zarr arrays. The default is "auto", which will infer the order from the input arrays, and fall back to "C" if that inference fails. | `'auto'` |
Source code in src/pydantic_ome_ngff/v04/multiscale.py
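A usage sketch for a two-level 2D image; the array contents and the `Axis` import path are assumptions.

```python
import numpy as np

from pydantic_ome_ngff.v04.axis import Axis  # assumed import path
from pydantic_ome_ngff.v04.multiscale import MultiscaleGroup

# Two in-memory arrays representing the same image at full and half resolution.
arrays = (
    np.zeros((1024, 1024), dtype="uint16"),
    np.zeros((512, 512), dtype="uint16"),
)
group = MultiscaleGroup.from_arrays(
    arrays,
    paths=("s0", "s1"),
    axes=(
        Axis(name="y", type="space", unit="micrometer"),
        Axis(name="x", type="space", unit="micrometer"),
    ),
    scales=((1.0, 1.0), (2.0, 2.0)),
    translations=((0.0, 0.0), (0.5, 0.5)),
    name="example",
)
```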
from_array_props classmethod #
from_array_props(dtype, shapes, paths, axes, scales, translations, name=None, type=None, metadata=None, chunks='auto', compressor=DEFAULT_COMPRESSOR, fill_value=0, order='C')
Create a `MultiscaleGroup` from a dtype and a sequence of shapes. The dtype and shapes are used to parametrize `ArraySpec` instances which model the Zarr arrays that would be created if the `MultiscaleGroup` was stored.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`dtype` | `DTypeLike` | The data type of the arrays. | required |
`shapes` | `Sequence[Sequence[int]]` | The shapes of the arrays. | required |
`paths` | `Sequence[str]` | The paths to the arrays. | required |
`axes` | `Sequence[Axis]` | An `Axis` object for each dimension of the arrays. | required |
`scales` | `Sequence[tuple[int \| float, ...]]` | A scale value for each axis of the array, for each shape in `shapes`. | required |
`translations` | `Sequence[tuple[int \| float, ...]]` | A translation value for each axis of the array, for each shape in `shapes`. | required |
`name` | `str \| None` | A name for the multiscale collection. Optional. | `None` |
`type` | `str \| None` | A description of the type of multiscale image represented by this group. Optional. | `None` |
`metadata` | `dict[str, Any] \| None` | Arbitrary metadata associated with this multiscale collection. Optional. | `None` |
`chunks` | `tuple[int, ...] \| tuple[tuple[int, ...], ...] \| Literal['auto']` | The chunks for the arrays in this multiscale group. If the string "auto" is provided, each array will have chunks set to the zarr-python default value, which depends on the shape and dtype of the array. If a single sequence of ints is provided, then this defines the chunks for all arrays. If a sequence of sequences of ints is provided, then this defines the chunks for each array. | `'auto'` |
`fill_value` | `Any` | The fill value for the Zarr arrays. | `0` |
`compressor` | `Codec` | The compressor to use for the arrays. Defaults to `DEFAULT_COMPRESSOR`. | `DEFAULT_COMPRESSOR` |
`order` | `Literal['C', 'F']` | The memory layout used for chunks of Zarr arrays. The default is "C". | `'C'` |
Source code in src/pydantic_ome_ngff/v04/multiscale.py
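A usage sketch that describes a two-level pyramid without materializing any array data; the `Axis` import path is assumed as above.

```python
from pydantic_ome_ngff.v04.axis import Axis  # assumed import path
from pydantic_ome_ngff.v04.multiscale import MultiscaleGroup

group = MultiscaleGroup.from_array_props(
    dtype="uint8",
    shapes=((4, 1024, 1024), (4, 512, 512)),
    paths=("s0", "s1"),
    axes=(
        Axis(name="c", type="channel"),
        Axis(name="y", type="space", unit="micrometer"),
        Axis(name="x", type="space", unit="micrometer"),
    ),
    scales=((1.0, 1.0, 1.0), (1.0, 2.0, 2.0)),
    translations=((0.0, 0.0, 0.0), (0.0, 0.5, 0.5)),
)
```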
check_arrays_exist #
Check that the arrays referenced in the `multiscales` metadata are actually contained in this group.
Source code in src/pydantic_ome_ngff/v04/multiscale.py
check_array_ndim #
Check that all the arrays referenced by the `multiscales` metadata have dimensionality consistent with the `coordinateTransformations` metadata.
Source code in src/pydantic_ome_ngff/v04/multiscale.py
ensure_scale_translation #
Ensures that there are either one or two transforms, that the first is a scale transform, and that the second, if present, is a translation transform.
Source code in src/pydantic_ome_ngff/v04/multiscale.py
create_dataset #
Create a `Dataset` from a path, a scale, and a translation. This metadata models a Zarr array that forms part of a multiscale group.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`path` | `str` | The path, relative to the multiscale group, of the Zarr array. | required |
`scale` | `Sequence[int \| float]` | The scale parameter for data stored in the Zarr array. This should define the spacing between elements of the coordinate grid of the data. | required |
`translation` | `Sequence[int \| float]` | The translation parameter for data stored in the Zarr array. This should define the origin of the coordinate grid of the data. | required |
Returns:

Type | Description |
---|---|
`Dataset` | A `Dataset` with the given path, scale, and translation. |
Source code in src/pydantic_ome_ngff/v04/multiscale.py
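A short usage sketch (import path assumed); note that the resulting transforms follow the scale-then-translation ordering required by `ensure_scale_translation`.

```python
from pydantic_ome_ngff.v04.multiscale import create_dataset  # assumed import path

# coordinateTransformations carries a scale transform followed by a translation.
ds = create_dataset(path="s1", scale=(2.0, 2.0), translation=(0.5, 0.5))
print(ds.path, ds.coordinateTransformations)
```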
ensure_axis_length #
Ensures that there are between 2 and 5 axes (inclusive).
Source code in src/pydantic_ome_ngff/v04/multiscale.py
ensure_axis_names #
Ensures that the names of the axes are unique.
Source code in src/pydantic_ome_ngff/v04/multiscale.py
ensure_axis_types #
Ensures that the following conditions are true:
- there are only 2 or 3 axes with type `space`
- the axes with type `space` are last in the list of axes
- there is no more than one axis with type `time`
- there is no more than one axis with type `channel`
- there is no more than one axis with a type that is not `space`, `time`, or `channel`
Source code in src/pydantic_ome_ngff/v04/multiscale.py
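For reference, an axis tuple that satisfies all of these constraints (the `Axis` import path is assumed):

```python
from pydantic_ome_ngff.v04.axis import Axis  # assumed import path

# time first, then channel, then the trailing space axes (2 or 3 of them).
axes = (
    Axis(name="t", type="time", unit="second"),
    Axis(name="c", type="channel"),
    Axis(name="z", type="space", unit="micrometer"),
    Axis(name="y", type="space", unit="micrometer"),
    Axis(name="x", type="space", unit="micrometer"),
)
```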
normalize_chunks #
If chunks is "auto", then use zarr default chunking based on the largest array for all the arrays. If chunks is a sequence of ints, then use those chunks for all arrays. If chunks is a sequence of sequences of ints, then use those chunks for each array.
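An illustrative re-implementation of these rules, not the library's actual code; the helper name and its use of whole-array chunks as a stand-in for zarr's "auto" heuristic are assumptions.

```python
from typing import Literal, Sequence


def normalize_chunks_sketch(
    chunks: tuple[int, ...] | tuple[tuple[int, ...], ...] | Literal["auto"],
    shapes: Sequence[tuple[int, ...]],
) -> tuple[tuple[int, ...], ...]:
    """Hypothetical helper showing the three accepted forms of `chunks`."""
    if chunks == "auto":
        # Stand-in for zarr's default chunking heuristic, which the real
        # implementation derives from the largest array.
        return tuple(tuple(shape) for shape in shapes)
    if all(isinstance(c, int) for c in chunks):
        # A single chunk tuple applies to every array.
        return tuple(tuple(chunks) for _ in shapes)
    # Otherwise one chunk tuple was supplied per array.
    return tuple(tuple(c) for c in chunks)
```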