Pydantic#

Example Output#

personinfo_pydantic.py

Overview#

The Pydantic generator produces Pydantic-flavored Python classes (models derived from BaseModel) from a LinkML model, with optional support for user-supplied jinja2 templates to generate alternative classes.

Example#

Given a definition of a Person class:

Person:
  is_a: NamedThing
  description: >-
    A person (alive, dead, undead, or fictional).
  class_uri: schema:Person
  mixins:
    - HasAliases
  slots:
    - primary_email
    - birth_date
    - age_in_years
    - gender
    - current_address
    - has_employment_history
    - has_familial_relationships
    - has_medical_history

(some details omitted for brevity, including slot definitions and parent classes)

The generated Python looks like this:

class Person(NamedThing):
    """
    A person (alive, dead, undead, or fictional).
    """
    primary_email: Optional[str] = Field(None)
    birth_date: Optional[str] = Field(None)
    age_in_years: Optional[int] = Field(None, ge=0, le=999)
    gender: Optional[GenderType] = Field(None)
    current_address: Optional[Address] = Field(None, description="""The address at which a person currently lives""")
    has_employment_history: Optional[List[EmploymentEvent]] = Field(None)
    has_familial_relationships: Optional[List[FamilialRelationship]] = Field(None)
    has_medical_history: Optional[List[MedicalEvent]] = Field(None)
    aliases: Optional[List[str]] = Field(None)
    id: Optional[str] = Field(None)
    name: Optional[str] = Field(None)
    description: Optional[str] = Field(None)
    image: Optional[str] = Field(None)
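
The same module can also be produced programmatically. A minimal sketch, assuming the schema above lives in a hypothetical personinfo.yaml:

from linkml.generators.pydanticgen import PydanticGenerator

# Load the schema and emit the generated pydantic module as a string
generator = PydanticGenerator("personinfo.yaml")
print(generator.serialize())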

Command Line#

gen-pydantic#

Generate pydantic classes to represent a LinkML model

gen-pydantic [OPTIONS] YAMLFILE

Options

-V, --version#

Show the version and exit.

--meta <meta>#

How to include linkml schema metadata in generated pydantic classes. See docs for MetadataMode for full description of choices. Default (auto) is to include all metadata that can’t be otherwise represented

Options:

MetadataMode.FULL | MetadataMode.EXCEPT_CHILDREN | MetadataMode.AUTO | MetadataMode.NONE

--black#

Format generated models with black (must be present in the environment)

--extra-fields <extra_fields>#

How to handle extra fields in BaseModel.

Options:

allow | ignore | forbid

--array-representations <array_representations>#

List of array representations to accept for array slots. Default is list of lists.

Options:

list | nparray

--template-dir <template_dir>#

Optional jinja2 template directory to use for class generation.

Pass a directory containing templates with the same name as any of the default TemplateModel templates to override them. The given directory will be searched for matching templates, and the default templates will be used as a fallback if an override is not found

Available templates to override:

- attribute.py.jinja
- base_model.py.jinja
- class.py.jinja
- conditional_import.py.jinja
- enum.py.jinja
- imports.py.jinja
- module.py.jinja
- validator.py.jinja

-f, --format <format>#

Output format

Default:

'pydantic'

Options:

pydantic

--metadata, --no-metadata#

Include metadata in output

Default:

True

--useuris, --metauris#

Use class and slot URIs over model uris

Default:

True

-im, --importmap <importmap>#

Import mapping file

--log_level <log_level>#

Logging level

Default:

'WARNING'

Options:

CRITICAL | ERROR | WARNING | INFO | DEBUG

-v, --verbose#

Verbosity. Takes precedence over --log_level.

--mergeimports, --no-mergeimports#

Merge imports into source file (default=mergeimports)

--stacktrace, --no-stacktrace#

Print a stack trace when an error occurs

Default:

False

Arguments

YAMLFILE#

Required argument

Generator#

class linkml.generators.pydanticgen.PydanticGenerator(schema: str | ~typing.TextIO | ~linkml_runtime.linkml_model.meta.SchemaDefinition | Generator | ~pathlib.Path, schemaview: ~linkml_runtime.utils.schemaview.SchemaView = None, format: str | None = None, metadata: bool = True, useuris: bool | None = None, log_level: int | None = 30, mergeimports: bool | None = True, source_file_date: str | None = None, source_file_size: int | None = None, logger: ~logging.Logger | None = None, verbose: bool | None = None, output: str | None = None, namespaces: ~linkml_runtime.utils.namespaces.Namespaces | None = None, directory_output: bool = False, base_dir: str = None, metamodel_name_map: ~typing.Dict[str, str] = None, importmap: str | ~typing.Mapping[str, str] | None = None, emit_prefixes: ~typing.Set[str] = <factory>, metamodel: ~linkml.utils.schemaloader.SchemaLoader = None, stacktrace: bool = False, include: str | ~pathlib.Path | ~linkml_runtime.linkml_model.meta.SchemaDefinition | None = None, template_file: str = None, package: str = 'example', array_representations: ~typing.List[~linkml.generators.pydanticgen.array.ArrayRepresentation] = <factory>, black: bool = False, template_dir: str | ~pathlib.Path | None = None, extra_fields: ~typing.Literal['allow', 'forbid', 'ignore'] = 'forbid', gen_mixin_inheritance: bool = True, injected_classes: ~typing.List[str | ~typing.Type] | None = None, injected_fields: ~typing.List[str] | None = None, imports: ~typing.List[~linkml.generators.pydanticgen.template.Import] | None = None, metadata_mode: ~linkml.generators.pydanticgen.pydanticgen.MetadataMode | str | None = MetadataMode.AUTO, split: bool = False, split_pattern: str = '.{{ schema.name }}', split_context: dict | None = None, split_mode: ~linkml.generators.pydanticgen.pydanticgen.SplitMode = SplitMode.AUTO, gen_classvars: bool = True, gen_slots: bool = True, genmeta: bool = False, emit_metadata: bool = True, _predefined_slot_values: ~typing.Dict[str, ~typing.Dict[str, str]] | None = None, _class_bases: ~typing.Dict[str, ~typing.List[str]] | None = None, **_kwargs)[source]#

Generates Pydantic-compliant classes from a schema

This is an alternative to the dataclasses-based Pythongen

SNAKE_CASE: ClassVar[str] = '(((?<!^)(?<!\\.))(?=[A-Z][a-z]))|([^\\w\\.]+)'#

Substitute CamelCase and non-word characters with _

black: bool = False#

If black is present in the environment, format the serialized code with it

property class_bases: Dict[str, List[str]]#

Generate the inheritance list for each class from is_a plus mixins

compile_module(**kwargs) module[source]#

Compiles generated Python code to a module.
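
As a sketch, the compiled module can be used to work with the generated classes in-process (the schema path and class name are illustrative):

from linkml.generators.pydanticgen import PydanticGenerator

gen = PydanticGenerator("personinfo.yaml")   # hypothetical schema path
module = gen.compile_module()
# classes defined in the schema are attributes of the compiled module
Person = module.Person
person = Person(id="P:001", name="Alice")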

generate_collection_key(slot_ranges: List[str], slot_def: SlotDefinition, class_def: ClassDefinition) str | None[source]#

Find the python range value (str, int, etc) for the identifier slot of a class used as a slot range.

If a pyrange value matches a class name, the range of the identifier slot will be returned. If more than one match is found and they don’t match, an exception will be raised.

Parameters:

slot_ranges – list of python range values

generate_module_import(schema: SchemaDefinition, context: dict | None = None) str[source]#

Generate the module string for importing from python modules generated from imported schemas when in split mode.

Use the split_pattern as a jinja template rendered with the SchemaDefinition and any passed context. Apply the SNAKE_CASE regex to substitute matches with _ and ensure lowercase.

generate_python_range(slot_range, slot_def: SlotDefinition, class_def: ClassDefinition) str[source]#

Generate the python range for a slot range value

classmethod generate_split(schema: str | Path | SchemaDefinition, output_path: str | Path = PosixPath('.'), split_pattern: str | None = None, split_context: dict | None = None, split_mode: SplitMode = SplitMode.AUTO, **kwargs) List[SplitResult][source]#

Generate a schema that imports from other schema as a set of python modules that import from one another, rather than generating all imported classes in a single schema.

Generates the main module from schema to output_path, and then generates any imported schemas (from which classes are actually used) to modules whose locations are determined by the module names generated by the split_pattern (see PydanticGenerator.split_pattern).

For example, for

  • a output_path of my_dir/v1_2_3/main.py

  • a schema main with a version v1.2.3

  • that imports from s2 with version v4.5.6,

  • and a split_pattern of ..{{ schema.version | replace('.', '_') }}.{{ schema.name }}

One would get:

  • my_dir/v1_2_3/main.py, as expected,

  • that imports from ..v4_5_6.s2,

  • and a module at my_dir/v4_5_6/s2.py

__init__.py files are generated for any directories that are between the generated modules and their highest common directory.

Parameters:
  • schema (str, Path , SchemaDefinition) – Main schema to generate

  • output_path (str, Path) – Python .py module to generate main schema to

  • split_pattern (str) – Pattern to use to generate module names, see PydanticGenerator.split_pattern

  • split_context (dict) – Additional variables to pass into jinja context when generating module import names.

Returns:

list[SplitResult]

generatorname: ClassVar[str] = 'pydanticgen.py'#

Name of the generator. Override with os.path.basename(__file__)

generatorversion: ClassVar[str] = '0.0.2'#

Version of the generator. Consider deprecating and instead use overall linkml version

get_array_representations_range(slot: SlotDefinition, range: str) List[SlotResult][source]#

Generate the python range for array representations

imports: List[Import] | None = None#

Additional imports to inject into generated module.

Examples:

from linkml.generators.pydanticgen.template import (
    ConditionalImport,
    ObjectImport,
    Import,
    Imports
)

imports = (Imports() +
    Import(module='sys') +
    Import(module='numpy', alias='np') +
    Import(module='pathlib', objects=[
        ObjectImport(name="Path"),
        ObjectImport(name="PurePath", alias="RenamedPurePath")
    ]) +
    ConditionalImport(
        module="typing",
        objects=[ObjectImport(name="Literal")],
        condition="sys.version_info >= (3, 8)",
        alternative=Import(
            module="typing_extensions",
            objects=[ObjectImport(name="Literal")]
        ),
    ).imports
)

becomes:

import sys
import numpy as np
from pathlib import (
    Path,
    PurePath as RenamedPurePath
)
if sys.version_info >= (3, 8):
    from typing import Literal
else:
    from typing_extensions import Literal

include_metadata(model: PydanticModule, source: SchemaDefinition) PydanticModule[source]#
include_metadata(model: PydanticClass, source: ClassDefinition) PydanticClass
include_metadata(model: PydanticAttribute, source: SlotDefinition) PydanticAttribute

Include metadata from the source schema that is otherwise not represented in the pydantic template models.

Metadata inclusion mode depends on metadata_mode - see MetadataMode for the available modes.

injected_classes: List[str | Type] | None = None#

A list/tuple of classes to inject into the generated module.

Accepts either live classes or strings. Live classes will have their source code extracted with inspect.getsource(), so they need to be standard Python classes declared in a source file (i.e. the module they are contained in needs a __file__ attribute).
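
A sketch of injecting one live class and one class given as a string (the names and schema path are hypothetical):

from linkml.generators.pydanticgen import PydanticGenerator
from mypackage.helpers import MyHelper  # hypothetical class defined in a normal source file

gen = PydanticGenerator(
    "personinfo.yaml",
    injected_classes=[
        MyHelper,                                             # live class; source is extracted with inspect
        "class AnotherHelper:\n    note: str = 'injected'",   # assumed: strings are injected as source
    ],
)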

injected_fields: List[str] | None = None#

A list/tuple of field strings to inject into the base class.

Examples:

injected_fields = (
    'object_id: Optional[str] = Field(None, description="Unique UUID for each object")',
)

metadata_mode: MetadataMode | str | None = 'auto'#

How to include schema metadata in generated pydantic models.

See MetadataMode for mode documentation

property predefined_slot_values: Dict[str, Dict[str, str]]#

Dictionary of dictionaries with predefined slot values for each class

serialize(rendered_module: PydanticModule | None = None) str[source]#

Serialize the schema to a pydantic module as a string

Parameters:

rendered_module (PydanticModule) – Optional; if the schema was previously rendered with render(), use that, otherwise render() fresh.

static sort_classes(clist: List[ClassDefinition], imported: List[ClassDefinition] | None = None) List[ClassDefinition][source]#

Sort classes such that if C is a child of P then C appears after P in the list.

Overridden method to include mixin classes

TODO: This should move to SchemaView

split: bool = False#

Generate schemas that import other schemas as separate Python modules that import from one another, rather than rolling everything into a single module (default: False).

split_context: dict | None = None#

Additional variables to pass into split_pattern when generating imported module names.

Passed in as **kwargs , so e.g. if split_context = {'myval': 1} then one would use it in a template string like {{ myval }}

split_mode: SplitMode = 'auto'#

How to filter imports from imported schema.

See SplitMode for description of options

split_pattern: str = '.{{ schema.name }}'#

When splitting generation, imported modules need to be generated separately and placed in a python package and import from each other. Since the location of those imported modules is variable – e.g. one might want to generate schema in multiple packages depending on their version – this pattern is used to generate the module portion of the import statement.

These patterns should generally yield a relative module import, since functions like generate_split() will generate and write files relative to some base file, though this is not a requirement since custom split generation logic is also allowed.

The pattern is a jinja template string that is given the SchemaDefinition of the imported schema in the environment. Additional variables can be passed into the jinja environment with the split_context argument.

Further modification is possible by using jinja filters.

After templating, the string is passed through a SNAKE_CASE pattern to replace whitespace and other characters that can’t be used in module names.

See also generate_module_import(), which is used to generate the module portion of the import statement (and can be overridden in subclasses).

Examples

for a schema named ExampleSchema and version 1.2.3

".{{ schema.name }}" (the default) becomes

from .example_schema import ClassA, ...

"...{{ schema.name }}.v{{ schema.version | replace('.', '_') }}" becomes

from ...example_schema.v1_2_3 import ClassA, ...

template_dir: str | Path | None = None#

Override templates for each TemplateModel.

Directory with templates that override the default TemplateModel.template for each class. If a matching template is not found in the override directory, the default templates will be used.
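
For example, a minimal sketch of overriding only the class template while falling back to the packaged defaults for everything else (the directory and schema path are illustrative):

from linkml.generators.pydanticgen import PydanticGenerator

# my_templates/class.py.jinja overrides the built-in class.py.jinja;
# templates not present in my_templates fall back to the defaults.
gen = PydanticGenerator("personinfo.yaml", template_dir="my_templates")
code = gen.serialize()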

valid_formats: ClassVar[List[str]] = ['pydantic']#

Allowed formats - first format is default

Split Generation#

Pydantic models can also be generated in a “split” mode where rather than rolling down all classes into a single file, schemas are kept as their own pydantic modules that import from one another.

Within the generator itself, split mode still generates a single module, except that classes from imported schemas are imported from other generated modules rather than included directly. This is wrapped by PydanticGenerator.generate_split(), which can be used to generate the module files directly.
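
A minimal sketch of driving split generation (schema names and paths are illustrative; the default pattern is shown explicitly):

from pathlib import Path
from linkml.generators.pydanticgen import PydanticGenerator

# Writes my_package/datamodel/main.py plus one module per imported schema that is
# actually used, with relative imports wired up between them.
results = PydanticGenerator.generate_split(
    "main.yaml",
    output_path=Path("my_package/datamodel/main.py"),
    split_pattern=".{{ schema.name }}",
)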

Templates#

The pydanticgen module has a templating system that allows each part of a schema to be generated independently and customized. See the documentation for the individual classes, but in short - each part of the output pydantic domain has a model with a corresponding template. At render time, each model is recursively rendered.

The PydanticGenerator then serves as a translation layer between the source models from linkml_runtime and the target models in pydanticgen.template , making clear what is needed to generate pydantic code as well as what parts of the linkml metamodel are supported.

Usage example:

Imports:

imports = (Imports() +
    Import(module="sys") +
    Import(module="pydantic", objects=[{"name": "BaseModel"}, {"name": "Field"}])
)

renders to:

import sys
from pydantic import (
    BaseModel,
    Field
)

Attributes:

attr = PydanticAttribute(
    name="my_field",
    annotations={"python_range": {"value": "str"}},
    title="My Field!",
    description="A Field that is mine!",
    pattern="my_.*",
)

By itself, renders to:

my_field: str = Field(None, title="My Field!", description="""A Field that is mine!""")

Classes:

cls = PydanticClass(
    name="MyClass",
    bases="BaseModel",
    description="A Class I Made!",
    attributes={"my_field": attr},
)

Renders to (along with the validator for the attribute):

class MyClass(BaseModel):
    my_field: str = Field(None, title="My Field!", description="""A Field that is mine!""")

    @validator('my_field', allow_reuse=True)
    def pattern_my_field(cls, v):
        pattern=re.compile(r"my_.*")
        if isinstance(v,list):
            for element in v:
                if not pattern.match(element):
                    raise ValueError(f"Invalid my_field format: {element}")
        elif isinstance(v,str):
            if not pattern.match(v):
                raise ValueError(f"Invalid my_field format: {v}")
        return v

Modules:

module = PydanticModule(imports=imports, classes={cls.name: cls})

Combined, all the pieces render to:

import sys
from pydantic import (
    BaseModel,
    Field
)

metamodel_version = "None"
version = "None"

class WeakRefShimBaseModel(BaseModel):
    __slots__ = '__weakref__'


class ConfiguredBaseModel(WeakRefShimBaseModel,
                validate_assignment = True,
                validate_all = True,
                underscore_attrs_are_private = True,
                extra = "forbid",
                arbitrary_types_allowed = True,
                use_enum_values = True):
    pass


class MyClass(BaseModel):
    my_field: str = Field(None, title="My Field!", description="""A Field that is mine!""")

    @validator('my_field', allow_reuse=True)
    def pattern_my_field(cls, v):
        pattern=re.compile(r"my_.*")
        if isinstance(v,list):
            for element in v:
                if not pattern.match(element):
                    raise ValueError(f"Invalid my_field format: {element}")
        elif isinstance(v,str):
            if not pattern.match(v):
                raise ValueError(f"Invalid my_field format: {v}")
        return v


# Update forward refs
# see https://pydantic-docs.helpmanual.io/usage/postponed_annotations/
MyClass.update_forward_refs()

class linkml.generators.pydanticgen.template.TemplateModel[source]#

Metaclass to render pydantic models with jinja templates.

Each subclass needs to declare a typing.ClassVar for a jinja template within the templates directory.

Templates are written expecting each of the other TemplateModels to already be rendered to strings - ie. rather than the class.py.jinja template receiving a full PydanticAttribute object or dictionary, it receives it having already been rendered to a string. See the render() method.

Black Formatting

Template models will try to use black to format results when it is available in the environment and render is called with black = True. If it isn't, the string is returned without any formatting beyond the template. This matters mostly for complex annotations like those produced for arrays; otherwise the template output is acceptably formatted as-is.

To install linkml with black, use the extra black dependency.

e.g. with pip:

pip install linkml[black]

or with poetry:

poetry install -E black

template: ClassVar[str]#
meta_exclude: ClassVar[List[str]] = None#
render(environment: Environment | None = None, black: bool = False) str[source]#

Recursively render a template model to a string.

For each field in the model, recurse through, rendering each TemplateModel using the template set in TemplateModel.template , but preserving the structure of lists and dictionaries. Regular BaseModel s are rendered to dictionaries. Any other value is passed through unchanged.

Parameters:
  • environment (jinja2.Environment) – Optional jinja environment to render with; defaults to the one returned by environment()

  • black (bool) – If True, format the rendered string with black (when available in the environment)

classmethod environment() Environment[source]#

Default environment for Template models. Uses a jinja2.PackageLoader for the templates directory within this module, with the trim_blocks and lstrip_blocks parameters set to True so that the default templates can be written in a more readable way.
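
As a sketch combining environment() and render() on a single template model (the field values are illustrative):

from linkml.generators.pydanticgen.template import PydanticAttribute, TemplateModel

attr = PydanticAttribute(name="age_in_years", range="Optional[int]", description="Age in years")
env = TemplateModel.environment()        # default jinja2 environment
print(attr.render(environment=env, black=False))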

classmethod exclude_from_meta() List[str][source]#

Attributes in the source definition to exclude from linkml_meta

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.EnumValue(*, label: str, value: str, description: str | None = None)[source]#

A single value within an Enum

label: str#
value: str#
description: str | None#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'label': FieldInfo(annotation=str, required=True), 'value': FieldInfo(annotation=str, required=True)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticEnum(*, name: str, description: str | None = None, values: Dict[str, EnumValue] = None)[source]#

Model used to render an enum.Enum

template: ClassVar[str] = 'enum.py.jinja'#
name: str#
description: str | None#
values: Dict[str, EnumValue]#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=True), 'values': FieldInfo(annotation=Dict[str, linkml.generators.pydanticgen.template.EnumValue], required=False, default_factory=dict)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticBaseModel(*, name: str = None, extra_fields: Literal['allow', 'forbid', 'ignore'] = 'forbid', fields: List[str] | None = None, strict: bool = False)[source]#

Parameterization of the base model that generated pydantic classes inherit from

template: ClassVar[str] = 'base_model.py.jinja'#
default_name: ClassVar[str] = 'ConfiguredBaseModel'#
name: str#
extra_fields: Literal['allow', 'forbid', 'ignore']#

Sets the extra model for pydantic models

fields: List[str] | None#

Extra fields that are typically injected into the base model via injected_fields

strict: bool#

Enable strict mode in the base model.

Note

Pydantic 2 only! Pydantic 1 only has strict types, not strict mode. See: https://github.com/linkml/linkml/issues/1955

References

https://docs.pydantic.dev/latest/concepts/strict_mode

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'extra_fields': FieldInfo(annotation=Literal['allow', 'forbid', 'ignore'], required=False, default='forbid'), 'fields': FieldInfo(annotation=Union[List[str], NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=False, default_factory=<lambda>), 'strict': FieldInfo(annotation=bool, required=False, default=False)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticAttribute(*, name: str, required: bool = False, identifier: bool = False, key: bool = False, predefined: str | None = None, range: str | None = None, title: str | None = None, description: str | None = None, equals_number: int | float | None = None, minimum_value: int | float | None = None, maximum_value: int | float | None = None, pattern: str | None = None, meta: Dict[str, Any] | None = None)[source]#

Reduced version of SlotDefinition that carries all and only the information needed by the template

template: ClassVar[str] = 'attribute.py.jinja'#
meta_exclude: ClassVar[List[str]] = ['from_schema', 'owner', 'range', 'multivalued', 'inlined', 'inlined_as_list']#
name: str#
required: bool#
identifier: bool#
key: bool#
predefined: str | None#

Fixed string to use in body of field

range: str | None#

Type annotation used for model field

title: str | None#
description: str | None#
equals_number: int | float | None#
minimum_value: int | float | None#
maximum_value: int | float | None#
pattern: str | None#
meta: Dict[str, Any] | None#

Metadata for the slot to be included in a Field annotation

property field: str#

Computed value to use inside of the generated Field

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {'field': ComputedFieldInfo(wrapped_property=<property object>, return_type=<class 'str'>, alias=None, alias_priority=None, title=None, field_title_generator=None, description='Computed value to use inside of the generated Field', deprecated=None, examples=None, json_schema_extra=None, repr=True)}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'equals_number': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'identifier': FieldInfo(annotation=bool, required=False, default=False), 'key': FieldInfo(annotation=bool, required=False, default=False), 'maximum_value': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'meta': FieldInfo(annotation=Union[Dict[str, Any], NoneType], required=False, default=None), 'minimum_value': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=True), 'pattern': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'predefined': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'range': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'required': FieldInfo(annotation=bool, required=False, default=False), 'title': FieldInfo(annotation=Union[str, NoneType], required=False, default=None)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticValidator(*, name: str, required: bool = False, identifier: bool = False, key: bool = False, predefined: str | None = None, range: str | None = None, title: str | None = None, description: str | None = None, equals_number: int | float | None = None, minimum_value: int | float | None = None, maximum_value: int | float | None = None, pattern: str | None = None, meta: Dict[str, Any] | None = None)[source]#

Trivial subclass of PydanticAttribute that uses the validator.py.jinja template instead

template: ClassVar[str] = 'validator.py.jinja'#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {'field': ComputedFieldInfo(wrapped_property=<property object>, return_type=<class 'str'>, alias=None, alias_priority=None, title=None, field_title_generator=None, description='Computed value to use inside of the generated Field', deprecated=None, examples=None, json_schema_extra=None, repr=True)}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'equals_number': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'identifier': FieldInfo(annotation=bool, required=False, default=False), 'key': FieldInfo(annotation=bool, required=False, default=False), 'maximum_value': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'meta': FieldInfo(annotation=Union[Dict[str, Any], NoneType], required=False, default=None), 'minimum_value': FieldInfo(annotation=Union[int, float, NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=True), 'pattern': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'predefined': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'range': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'required': FieldInfo(annotation=bool, required=False, default=False), 'title': FieldInfo(annotation=Union[str, NoneType], required=False, default=None)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticClass(*, name: str, bases: List[str] | str = 'ConfiguredBaseModel', description: str | None = None, attributes: Dict[str, PydanticAttribute] | None = None, meta: Dict[str, Any] | None = None)[source]#

Reduced version of ClassDefinition that carries all and only the information needed by the template.

On instantiation and rendering, will create any additional validators that are implied by the given attributes. Currently the only kind of slot-level validators that are created are for those slots that have a pattern property.

template: ClassVar[str] = 'class.py.jinja'#
meta_exclude: ClassVar[List[str]] = ['slots', 'is_a']#
name: str#
bases: List[str] | str#
description: str | None#
attributes: Dict[str, PydanticAttribute] | None#
meta: Dict[str, Any] | None#

Metadata for the class to be included in a linkml_meta class attribute

property validators: Dict[str, PydanticValidator] | None#
property slots: Dict[str, PydanticAttribute] | None#

alias of attributes

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {'slots': ComputedFieldInfo(wrapped_property=<property object>, return_type=typing.Optional[typing.Dict[str, linkml.generators.pydanticgen.template.PydanticAttribute]], alias=None, alias_priority=None, title=None, field_title_generator=None, description='alias of attributes', deprecated=None, examples=None, json_schema_extra=None, repr=True), 'validators': ComputedFieldInfo(wrapped_property=<property object>, return_type=typing.Optional[typing.Dict[str, linkml.generators.pydanticgen.template.PydanticValidator]], alias=None, alias_priority=None, title=None, field_title_generator=None, description=None, deprecated=None, examples=None, json_schema_extra=None, repr=True)}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'attributes': FieldInfo(annotation=Union[Dict[str, linkml.generators.pydanticgen.template.PydanticAttribute], NoneType], required=False, default=None), 'bases': FieldInfo(annotation=Union[List[str], str], required=False, default='ConfiguredBaseModel'), 'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'meta': FieldInfo(annotation=Union[Dict[str, Any], NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=True)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.ObjectImport(*, name: str, alias: str | None = None)[source]#

An object to be imported from within a module.

See Import for examples

name: str#
alias: str | None#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'alias': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'name': FieldInfo(annotation=str, required=True)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.Import(*, module: str, alias: str | None = None, objects: List[ObjectImport] | None = None, schema: bool = False)[source]#

A python module, or module and classes to be imported.

Examples

Module import:

>>> Import(module='sys').render()
import sys
>>> Import(module='numpy', alias='np').render()
import numpy as np

Class import:

>>> Import(module='pathlib', objects=[
>>>     ObjectImport(name="Path"),
>>>     ObjectImport(name="PurePath", alias="RenamedPurePath")
>>> ]).render()
from pathlib import (
    Path,
    PurePath as RenamedPurePath
)

template: ClassVar[str] = 'imports.py.jinja'#
module: str#
alias: str | None#
objects: List[ObjectImport] | None#
merge(other: Import) List[Import][source]#

Merge one import with another, see Imports() for an example.

  • If module don’t match, return both

  • If one or the other are a ConditionalImport, return both

  • If modules match, neither contain objects, but the other has an alias, return the other

  • If modules match, one contains objects but the other doesn’t, return both

  • If modules match, both contain objects, merge the object lists, preferring objects with aliases
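
A sketch of the last case, where matching modules have their object lists merged:

from linkml.generators.pydanticgen.template import Import, ObjectImport

a = Import(module="pathlib", objects=[ObjectImport(name="Path")])
b = Import(module="pathlib", objects=[ObjectImport(name="PurePath")])
merged = a.merge(b)
# expected: a single-element list whose Import carries both Path and PurePath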

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'alias': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'module': FieldInfo(annotation=str, required=True), 'objects': FieldInfo(annotation=Union[List[linkml.generators.pydanticgen.template.ObjectImport], NoneType], required=False, default=None), 'schema': FieldInfo(annotation=bool, required=False, default=False)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.ConditionalImport(*, module: str, alias: str | None = None, objects: ~typing.List[~linkml.generators.pydanticgen.template.ObjectImport] | None = None, schema: bool = <bound method BaseModel.schema of <class 'linkml.generators.pydanticgen.template.ConditionalImport'>>, condition: str, alternative: ~linkml.generators.pydanticgen.template.Import)[source]#

Import that depends on some condition in the environment, common when using backported features or straddling dependency versions.

Make sure that everything that is needed to evaluate the condition is imported before this is added to the injected imports!

Examples

Conditionally import Literal from typing_extensions when running on Python older than 3.8:

imports = (Imports() +
    Import(module='sys') +
    ConditionalImport(
        module="typing",
        objects=[ObjectImport(name="Literal")],
        condition="sys.version_info >= (3, 8)",
        alternative=Import(
            module="typing_extensions",
            objects=[ObjectImport(name="Literal")]
        )
    )
)

Renders to:

import sys
if sys.version_info >= (3, 8):
    from typing import Literal
else:
    from typing_extensions import Literal

template: ClassVar[str] = 'conditional_import.py.jinja'#
condition: str#
alternative: Import#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'alias': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'alternative': FieldInfo(annotation=Import, required=True), 'condition': FieldInfo(annotation=str, required=True), 'module': FieldInfo(annotation=str, required=True), 'objects': FieldInfo(annotation=Union[List[linkml.generators.pydanticgen.template.ObjectImport], NoneType], required=False, default=None), 'schema': FieldInfo(annotation=bool, required=False, default=<bound method BaseModel.schema of <class 'linkml.generators.pydanticgen.template.ConditionalImport'>>)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.Imports(*, imports: List[Import | ConditionalImport] = None)[source]#

Container class for imports that can handle merging!

See Import and ConditionalImport for examples of declaring individual imports

Useful for generation, because each build stage will potentially generate overlapping imports. This ensures that we can keep a collection of imports without having many duplicates.

Defines methods for adding, iterating, and indexing from within the Imports.imports list.

Examples

imports = (Imports() +
    Import(module="sys") +
    Import(module="pathlib", objects=[ObjectImport(name="Path")]) +
    Import(module="sys")
)

Renders to:

from pathlib import Path
import sys

template: ClassVar[str] = 'imports.py.jinja'#
imports: List[Import | ConditionalImport]#
classmethod imports_are_merged(imports: List[Import | ConditionalImport]) List[Import | ConditionalImport][source]#

When creating from a list of imports, construct the model as if it had been built by iteratively adding each import with __add__ calls

model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'imports': FieldInfo(annotation=List[Union[linkml.generators.pydanticgen.template.Import, linkml.generators.pydanticgen.template.ConditionalImport]], required=False, default_factory=list)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

class linkml.generators.pydanticgen.template.PydanticModule(*, metamodel_version: str | None = None, version: str | None = None, base_model: PydanticBaseModel = PydanticBaseModel(name='ConfiguredBaseModel', extra_fields='forbid', fields=None, strict=False), injected_classes: List[str] | None = None, python_imports: List[Import | ConditionalImport] = None, enums: Dict[str, PydanticEnum] = None, classes: Dict[str, PydanticClass] = None, meta: Dict[str, Any] | None = None)[source]#

Top-level container model for generating a pydantic module :)

template: ClassVar[str] = 'module.py.jinja'#
meta_exclude: ClassVar[str] = ['slots']#
model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] = {'class_names': ComputedFieldInfo(wrapped_property=<property object>, return_type=typing.List[str], alias=None, alias_priority=None, title=None, field_title_generator=None, description=None, deprecated=None, examples=None, json_schema_extra=None, repr=True)}#

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_config: ClassVar[ConfigDict] = {}#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'base_model': FieldInfo(annotation=PydanticBaseModel, required=False, default=PydanticBaseModel(name='ConfiguredBaseModel', extra_fields='forbid', fields=None, strict=False)), 'classes': FieldInfo(annotation=Dict[str, linkml.generators.pydanticgen.template.PydanticClass], required=False, default_factory=dict), 'enums': FieldInfo(annotation=Dict[str, linkml.generators.pydanticgen.template.PydanticEnum], required=False, default_factory=dict), 'injected_classes': FieldInfo(annotation=Union[List[str], NoneType], required=False, default=None), 'meta': FieldInfo(annotation=Union[Dict[str, Any], NoneType], required=False, default=None), 'metamodel_version': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'python_imports': FieldInfo(annotation=List[Union[linkml.generators.pydanticgen.template.Import, linkml.generators.pydanticgen.template.ConditionalImport]], required=False, default_factory=list), 'version': FieldInfo(annotation=Union[str, NoneType], required=False, default=None)}#

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

metamodel_version: str | None#
version: str | None#
base_model: PydanticBaseModel#
injected_classes: List[str] | None#
python_imports: List[Import | ConditionalImport]#
enums: Dict[str, PydanticEnum]#
classes: Dict[str, PydanticClass]#
meta: Dict[str, Any] | None#

Metadata for the schema to be included in a linkml_meta module-level instance of LinkMLMeta

property class_names: List[str]#

Arrays#

TODO

Narrative documentation for pydantic LoL Arrays. Subsection this by different array reps

See Schemas/Arrays
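
Pending that narrative, a minimal sketch of selecting the list-of-lists representation when constructing the generator (the schema path is hypothetical, and the enum value string is assumed to mirror the CLI option):

from linkml.generators.pydanticgen import PydanticGenerator
from linkml.generators.pydanticgen.array import ArrayRepresentation

# "list" mirrors --array-representations list; array slots are then rendered
# as (possibly nested) List[...] ranges
gen = PydanticGenerator(
    "arrays_schema.yaml",
    array_representations=[ArrayRepresentation("list")],
)
print(gen.serialize())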

class linkml.generators.pydanticgen.array.ArrayRepresentation(value)[source]#

An enumeration.

class linkml.generators.pydanticgen.array.AnyShapeArrayType[source]#
class linkml.generators.pydanticgen.array.ArrayRangeGenerator(array: ArrayExpression | None, dtype: str | Element)[source]#

Metaclass for generating a given format of array range.

See Shape Forms for more details on array range forms.

These classes do only enough validation of the array specification to decide which kind of representation to generate. Proper value validation should happen elsewhere (ie. in the metamodel and generated ArrayExpression class.)

Each of the array representation generation methods should be able to handle the supported pydantic versions (currently still 1 and 2).

Notes

When checking for array specification, recall that there is a semantic difference between None and False , particularly for ArrayExpression.max_number_dimensions - check for absence of specification with is None rather than checking for truthiness/falsiness (unless that’s what you intend to do ofc ;)

array#

Array to create a range for

Type:

ArrayExpression

dtype#

dtype of the entire array as a string

Type:

Union[str, Element]

make() RangeResult[source]#

Create the string form of the array representation

property has_bounded_dimensions: bool#

Whether the ArrayExpression has some shape specification aside from dimensions

classmethod get_generator(repr: ArrayRepresentation) Type[ArrayRangeGenerator][source]#

Get the generator class for a given array representation

abstract any_shape(array: ArrayRepresentation | None = None) RangeResult[source]#

Any shaped array!

abstract bounded_dimensions(array: ArrayExpression) RangeResult[source]#

Array shape specified numerically, without axis parameterization

abstract parameterized_dimensions(array: ArrayExpression) RangeResult[source]#

Array shape specified with dimensions without additional parameterized dimensions

abstract complex_dimensions(array: ArrayExpression) RangeResult[source]#

Array shape with both parameterized and bounded dimensions

class linkml.generators.pydanticgen.array.ListOfListsArray(array: ArrayExpression | None, dtype: str | Element)[source]#

Represent arrays as lists of lists!

TODO: Move all validation of values (eg. anywhere we raise a ValueError) to the ArrayExpression dataclass and out of the generator class

any_shape(array: ArrayExpression | None = None, with_inner_union: bool = False) RangeResult[source]#

An AnyShaped array (using AnyShapeArray )

Parameters:
  • array (ArrayExpression) – The array expression (not used)

  • with_inner_union (bool) – If True , the innermost type is a Union of the AnyShapeArray class and dtype (default: False )

bounded_dimensions(array: ArrayExpression) RangeResult[source]#

A nested series of List[] ranges with dtype at the center.

When an array expression allows for a range of dimensions, each set of List s is joined by a Union .

parameterized_dimensions(array: ArrayExpression) RangeResult[source]#

Constrained shapes using pydantic.conlist()

TODO: - preservation of aliases - (what other metadata is allowable on labeled dimensions?)

complex_dimensions(array: ArrayExpression) RangeResult[source]#

Mixture of parameterized dimensions with a max or min (or both) shape for anonymous dimensions.

A mixture of List , conlist , and AnyShapeArray .

class linkml.generators.pydanticgen.array.NPTypingArray(**kwargs)[source]#

Represent array range with nptyping, and serialization/loading with an ArrayProxy

Additional Notes#

LinkML contains two Python generators. The Pydantic generator is particularly useful for FastAPI, but it is newer and less fully featured than the standard Python (dataclass) generator.