
mindnlp.peft.mapping

mappings

mindnlp.peft.mapping.get_peft_config(config_dict)

Returns a Peft config object from a dictionary.

PARAMETER DESCRIPTION
config_dict

Dictionary containing the configuration parameters.

TYPE: `Dict[str, Any]`

Source code in mindnlp/peft/mapping.py
def get_peft_config(config_dict: Dict[str, Any]):
    """
    Returns a Peft config object from a dictionary.

    Args:
        config_dict (`Dict[str, Any]`): Dictionary containing the configuration parameters.
    """
    return PEFT_TYPE_TO_CONFIG_MAPPING[config_dict["peft_type"]](**config_dict)
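The function is a thin dispatcher: it reads the `peft_type` key and forwards the whole dictionary to the matching config class. A minimal sketch of that pattern, using hypothetical stand-in classes rather than mindnlp's real config types:

```python
from dataclasses import dataclass

# Hypothetical stand-in for a mindnlp config class, used only to
# illustrate how get_peft_config dispatches on the "peft_type" key.
@dataclass
class LoraConfig:
    peft_type: str = "LORA"
    r: int = 8
    lora_alpha: int = 16

PEFT_TYPE_TO_CONFIG_MAPPING = {"LORA": LoraConfig}

def get_peft_config(config_dict):
    # Look up the config class by "peft_type", then pass the whole
    # dict as keyword arguments to its constructor.
    return PEFT_TYPE_TO_CONFIG_MAPPING[config_dict["peft_type"]](**config_dict)

cfg = get_peft_config({"peft_type": "LORA", "r": 4, "lora_alpha": 32})
print(type(cfg).__name__, cfg.r)  # LoraConfig 4
```

Because the dictionary is splatted into the constructor, any key that is not a field of the target config class will raise a `TypeError`, which is worth keeping in mind when round-tripping configs through JSON.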

mindnlp.peft.mapping.get_peft_model(model, peft_config, adapter_name='default')

Returns a Peft model object from a model and a config.

PARAMETER DESCRIPTION
model

Model to be wrapped.

TYPE: [`transformers.PreTrainedModel`]

peft_config

Configuration object containing the parameters of the Peft model.

TYPE: [`PeftConfig`]

Source code in mindnlp/peft/mapping.py
def get_peft_model(model: nn.Module, peft_config: PeftConfig, adapter_name: str = "default") -> PeftModel:
    """
    Returns a Peft model object from a model and a config.

    Args:
        model ([`transformers.PreTrainedModel`]): Model to be wrapped.
        peft_config ([`PeftConfig`]): Configuration object containing the parameters of the Peft model.
    """
    model_config = getattr(model, "config", {"model_type": "custom"})
    if hasattr(model_config, "to_dict"):
        model_config = model_config.to_dict()
    peft_config.base_model_name_or_path = model.__dict__.get("name_or_path", None)

    # no specific task_type and is not prompt_learning
    if peft_config.task_type not in MODEL_TYPE_TO_PEFT_MODEL_MAPPING.keys() and not peft_config.is_prompt_learning:
        return PeftModel(model, peft_config, adapter_name=adapter_name)

    # TODO: prompt learning
    # if peft_config.is_prompt_learning:
    #     # peft_config = _prepare_prompt_learning_config(peft_config, model_config)
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](model, peft_config, adapter_name=adapter_name)
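The routing logic above has two branches: configs whose `task_type` is unknown (and which are not prompt learning) fall back to the generic `PeftModel`, while recognized task types get a task-specific wrapper. A self-contained sketch of that branch, with stand-in classes (the real mindnlp classes also copy the base model's name and config, which is omitted here):

```python
# Stand-in wrapper classes, for illustration only.
class PeftModel:
    def __init__(self, model, peft_config, adapter_name="default"):
        self.model = model
        self.peft_config = peft_config
        self.adapter_name = adapter_name

class PeftModelForCausalLM(PeftModel):
    pass

MODEL_TYPE_TO_PEFT_MODEL_MAPPING = {"CAUSAL_LM": PeftModelForCausalLM}

def get_peft_model(model, peft_config, adapter_name="default"):
    # Unknown task_type: fall back to the generic wrapper.
    if peft_config["task_type"] not in MODEL_TYPE_TO_PEFT_MODEL_MAPPING:
        return PeftModel(model, peft_config, adapter_name=adapter_name)
    # Known task_type: use the task-specific wrapper.
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config["task_type"]](
        model, peft_config, adapter_name=adapter_name
    )

wrapped = get_peft_model(object(), {"task_type": "CAUSAL_LM"})
print(type(wrapped).__name__)  # PeftModelForCausalLM
```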

mindnlp.peft.mapping.inject_adapter_in_model(peft_config, model, adapter_name='default')

A simple API to create and inject an adapter in place into a model. Currently the API does not support prompt learning methods or adaption prompt. Make sure the correct target_names are set in the peft_config object. The API calls get_peft_model under the hood but is restricted to non-prompt-learning methods.

PARAMETER DESCRIPTION
peft_config

Configuration object containing the parameters of the Peft model.

TYPE: `PeftConfig`

model

The input model where the adapter will be injected.

TYPE: `nn.Module`

adapter_name

The name of the adapter to be injected. If not provided, the default adapter name ("default") is used.

TYPE: `str`, optional DEFAULT: 'default'

Source code in mindnlp/peft/mapping.py
def inject_adapter_in_model(
    peft_config: PeftConfig, model: nn.Module, adapter_name: str = "default"
) -> nn.Module:
    r"""
    A simple API to create and inject adapter in-place into a model. Currently the API does not support prompt learning
    methods and adaption prompt. Make sure to have the correct `target_names` set in the `peft_config` object. The API
    calls `get_peft_model` under the hood but would be restricted only to non-prompt learning methods.

    Args:
        peft_config (`PeftConfig`):
            Configuration object containing the parameters of the Peft model.
        model (`nn.Module`):
            The input model where the adapter will be injected.
        adapter_name (`str`, `optional`, defaults to `"default"`):
            The name of the adapter to be injected, if not provided, the default adapter name is used ("default").
    """
    if peft_config.is_prompt_learning or peft_config.is_adaption_prompt:
        raise ValueError("`create_and_replace` does not support prompt learning and adaption prompt yet.")

    if peft_config.peft_type not in PEFT_TYPE_TO_TUNER_MAPPING.keys():
        raise ValueError(
            f"`inject_adapter_in_model` does not support {peft_config.peft_type} yet. Please use `get_peft_model`."
        )

    tuner_cls = PEFT_TYPE_TO_TUNER_MAPPING[peft_config.peft_type]

    # By instantiating a peft model we are injecting randomly initialized LoRA layers into the model's cells.
    peft_model = tuner_cls(model, peft_config, adapter_name=adapter_name)

    return peft_model.model
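Unlike get_peft_model, this function returns the bare model with adapter layers injected, not a Peft wrapper: the tuner mutates the model in place and only `peft_model.model` is handed back. A sketch of that flow with a dummy tuner (all names here are illustrative, not mindnlp's API):

```python
# Dummy tuner: in the real library this injects LoRA layers into the
# model's cells; here it just records the adapter name on the model.
class DummyLoraTuner:
    def __init__(self, model, peft_config, adapter_name="default"):
        model.adapters = getattr(model, "adapters", []) + [adapter_name]
        self.model = model

PEFT_TYPE_TO_TUNER_MAPPING = {"LORA": DummyLoraTuner}

def inject_adapter_in_model(peft_config, model, adapter_name="default"):
    if peft_config["peft_type"] not in PEFT_TYPE_TO_TUNER_MAPPING:
        raise ValueError("unsupported peft_type; use get_peft_model instead")
    tuner_cls = PEFT_TYPE_TO_TUNER_MAPPING[peft_config["peft_type"]]
    peft_model = tuner_cls(model, peft_config, adapter_name=adapter_name)
    # Return the underlying model, not the tuner wrapper.
    return peft_model.model

class Net:
    pass

net = inject_adapter_in_model({"peft_type": "LORA"}, Net())
print(type(net).__name__, net.adapters)  # Net ['default']
```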