
LoKr

mindnlp.peft.tuners.lokr.config

lokr.

mindnlp.peft.tuners.lokr.config.LoKrConfig dataclass

Bases: PeftConfig

This is the configuration class to store the configuration of a [LoKrModel].

PARAMETER DESCRIPTION
r

LoKr attention dimension (rank).

TYPE: `int` DEFAULT: 8

target_cells

The names of the cells to apply LoKr to.

TYPE: `Union[List[str],str]` DEFAULT: None

lora_alpha

The alpha parameter for LoKr scaling.

TYPE: `float` DEFAULT: 8

rank_dropout

The dropout probability for rank dimension during training.

TYPE: `float` DEFAULT: 0.0

cell_dropout

The dropout probability for LoKr layers.

TYPE: `float` DEFAULT: 0.0

use_effective_conv2d

Use parameter-efficient decomposition for Conv2d with kernel size > 1 ("Proposition 3" from the FedPara paper).

TYPE: `bool` DEFAULT: False

decompose_both

Perform rank decomposition of the left Kronecker product matrix.

TYPE: `bool` DEFAULT: False

decompose_factor

Kronecker product decomposition factor.

TYPE: `int` DEFAULT: -1

bias

Bias type for LoKr. Can be 'none', 'all' or 'lora_only'.

TYPE: `str` DEFAULT: 'none'

cells_to_save

List of cells apart from LoKr layers to be set as trainable and saved in the final checkpoint.

TYPE: `List[str]` DEFAULT: None

init_weights

Whether to perform initialization of adapter weights. This defaults to True; passing False is discouraged.

TYPE: `bool` DEFAULT: True

layers_to_transform

The layer indexes to transform. If this argument is specified, the LoKr transformations are applied only to the layer indexes in this list. If a single integer is passed, the transformation is applied to the layer at that index.

TYPE: `Union[List[int],int]` DEFAULT: None

layers_pattern

The layer pattern name, used only if layers_to_transform is not None and the layer pattern is not among the common layer patterns.

TYPE: `str` DEFAULT: None

rank_pattern

The mapping from layer names or regular expressions to ranks that differ from the default rank specified by r.

TYPE: `dict` DEFAULT: dict()

alpha_pattern

The mapping from layer names or regular expressions to alphas that differ from the default alpha specified by lora_alpha.

TYPE: `dict` DEFAULT: dict()
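
A minimal sketch of how these fields combine when constructing a config; the cell names below are hypothetical and depend on the model being adapted:

from mindnlp.peft.tuners.lokr.config import LoKrConfig  # module documented on this page

config = LoKrConfig(
    r=8,                                # default rank for all matched cells
    lora_alpha=16,
    target_cells=["q_proj", "v_proj"],  # hypothetical cell names
    rank_dropout=0.1,
    decompose_both=True,
    rank_pattern={"v_proj": 4},         # override the default rank for v_proj
    alpha_pattern={"v_proj": 8},        # override the default alpha for v_proj
)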

Source code in mindnlp/peft/tuners/lokr/config.py
@dataclass
class LoKrConfig(PeftConfig):
    """
    This is the configuration class to store the configuration of a [`LoKrModel`].

    Args:
        r (`int`): LoKr attention dimension (rank).
        target_cells (`Union[List[str],str]`): The names of the cells to apply LoKr to.
        lora_alpha (`float`): The alpha parameter for LoKr scaling.
        rank_dropout (`float`): The dropout probability for rank dimension during training.
        cell_dropout (`float`): The dropout probability for LoKr layers.
        use_effective_conv2d (`bool`):
            Use parameter effective decomposition for
            Conv2d with ksize > 1 ("Proposition 3" from FedPara paper).
        decompose_both (`bool`): Perform rank decomposition of the left Kronecker product matrix.
        decompose_factor (`int`): Kronecker product decomposition factor.

        bias (`str`): Bias type for LoKr. Can be 'none', 'all' or 'lora_only'.
        cells_to_save (`List[str]`):
            List of cells apart from LoKr layers to be set as trainable
            and saved in the final checkpoint.
        init_weights (`bool`):
            Whether to perform initialization of adapter weights. This defaults to `True`;
            passing `False` is discouraged.
        layers_to_transform (`Union[List[int],int]`):
            The layer indexes to transform. If this argument is specified, the LoKr transformations are applied
            only to the layer indexes in this list. If a single integer is passed, the transformation is applied
            to the layer at that index.
        layers_pattern (`str`):
            The layer pattern name, used only if `layers_to_transform` is not `None` and the layer
            pattern is not among the common layer patterns.
        rank_pattern (`dict`):
            The mapping from layer names or regular expressions to ranks that differ from the default rank
            specified by `r`.
        alpha_pattern (`dict`):
            The mapping from layer names or regular expressions to alphas that differ from the default alpha
            specified by `lora_alpha`.
    """
    r: int = field(default=8, metadata={"help": "lokr attention dimension"})
    target_cells: Optional[Union[List[str], str]] = field(
        default=None,
        metadata={
            "help": "List of cell names or regex expression of the cell names to replace with Lora."
            "For example, ['q', 'v'] or '.*decoder.*(SelfAttention|EncDecAttention).*(q|v)$' "
        },
    )
    lora_alpha: int = field(default=8, metadata={"help": "lokr alpha"})
    rank_dropout: float = field(
        default=0.0,
        metadata={"help": "The dropout probability for rank dimension during training"},
    )
    cell_dropout: float = field(default=0.0, metadata={"help": "lokr dropout"})
    use_effective_conv2d: bool = field(
        default=False,
        metadata={
            "help": 'Use parameter effective decomposition for Conv2d 3x3 with ksize > 1 ("Proposition 3" from FedPara paper)'
        },
    )
    decompose_both: bool = field(
        default=False,
        metadata={
            "help": "Perform rank decomposition of left kronecker product matrix."
        },
    )
    decompose_factor: int = field(
        default=-1, metadata={"help": "Kronecker product decomposition factor."}
    )

    bias: str = field(
        default="none",
        metadata={"help": "Bias type for Lora. Can be 'none', 'all' or 'lora_only'"},
    )
    cells_to_save: Optional[List[str]] = field(
        default=None,
        metadata={
            "help": "List of cells apart from LoRA layers to be set as trainable and saved in the final checkpoint. "
            "For example, in Sequence Classification or Token Classification tasks, "
            "the final layer `classifier/score` are randomly initialized and as such need to be trainable and saved."
        },
    )
    init_weights: bool = field(
        default=True,
        metadata={"help": "Whether to initialize the weights of the Lora layers."},
    )
    layers_to_transform: Optional[Union[List, int]] = field(
        default=None,
        metadata={
            "help": "The layer indexes to transform, is this argument is specified, \
                PEFT will transform only the layers indexes that are specified inside this list. \
                If a single integer is passed, PEFT will transform only the layer at this index."
        },
    )
    layers_pattern: Optional[str] = field(
        default=None,
        metadata={
            "help": "The layer pattern name, used only if `layers_to_transform` is different to None and \
                  if the layer pattern is not in the common layers pattern."
        },
    )
    rank_pattern: Optional[dict] = field(
        default_factory=dict,
        metadata={
            "help": (
                "The mapping from layer names or regexp expression to ranks which are different from the default rank specified by `r`. "
                "For example, `{model.decoder.layers.0.encoder_attn.k_proj: 8`}"
            )
        },
    )
    alpha_pattern: Optional[dict] = field(
        default_factory=dict,
        metadata={
            "help": (
                "The mapping from layer names or regexp expression to alphas which are different from the default alpha specified by `alpha`. "
                "For example, `{model.decoder.layers.0.encoder_attn.k_proj: 32`}"
            )
        },
    )

    def __post_init__(self):
        r"""
        Method to initialize the attributes of the LoKrConfig class after object creation.

        Args:
            self: Instance of the LoKrConfig class.

        Returns:
            None. This method performs attribute initialization within the class.

        Raises:
            No specific exceptions are raised within this method.
        """
        self.peft_type = PeftType.LOKR

    @property
    def is_prompt_learning(self):
        r"""
        Utility method to check if the configuration is for prompt learning.
        """
        return False

mindnlp.peft.tuners.lokr.config.LoKrConfig.is_prompt_learning property

Utility method to check if the configuration is for prompt learning.

mindnlp.peft.tuners.lokr.config.LoKrConfig.__post_init__()

Method to initialize the attributes of the LoKrConfig class after object creation.

PARAMETER DESCRIPTION
self

Instance of the LoKrConfig class.

RETURNS DESCRIPTION

None. This method performs attribute initialization within the class.
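
Concretely, constructing a config is enough to set the PEFT type; a minimal check (import paths assumed):

from mindnlp.peft.tuners.lokr.config import LoKrConfig
from mindnlp.peft.utils import PeftType  # import path assumed

config = LoKrConfig()
assert config.peft_type == PeftType.LOKR  # assigned by __post_init__
assert config.is_prompt_learning is False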

Source code in mindnlp/peft/tuners/lokr/config.py
def __post_init__(self):
    r"""
    Method to initialize the attributes of the LoKrConfig class after object creation.

    Args:
        self: Instance of the LoKrConfig class.

    Returns:
        None. This method performs attribute initialization within the class.

    Raises:
        No specific exceptions are raised within this method.
    """
    self.peft_type = PeftType.LOKR

mindnlp.peft.tuners.lokr.model

Lokr.

mindnlp.peft.tuners.lokr.model.LoKrModel

Bases: BaseTuner

Creates a Low-Rank Kronecker Product model from a pretrained model. The original method is partially described in https://arxiv.org/abs/2108.06098 and in https://arxiv.org/abs/2309.14859. The current implementation heavily borrows from https://github.com/KohakuBlueleaf/LyCORIS/blob/eb460098187f752a5d66406d3affade6f0a07ece/lycoris/cells/lokr.py

PARAMETER DESCRIPTION
model

The model to which the adapter tuner layers will be attached.

TYPE: `mindspore.nn.Module`

peft_config

The configuration of the LoKr model.

TYPE: [`LoKrConfig`]

adapter_name

The name of the adapter, defaults to "default".

TYPE: `str`

RETURNS DESCRIPTION
LoKrModel

The LoKr model.

TYPE: [`mindspore.nn.Module`]

Example
>>> from diffusers import StableDiffusionPipeline
>>> from peft import LoKrModel, LoKrConfig

>>> config_te = LoKrConfig(
...     r=8,
...     lora_alpha=32,
...     target_cells=["k_proj", "q_proj", "v_proj", "out_proj", "fc1", "fc2"],
...     rank_dropout=0.0,
...     cell_dropout=0.0,
...     init_weights=True,
... )
>>> config_unet = LoKrConfig(
...     r=8,
...     lora_alpha=32,
...     target_cells=[
...         "proj_in",
...         "proj_out",
...         "to_k",
...         "to_q",
...         "to_v",
...         "to_out.0",
...         "ff.net.0.proj",
...         "ff.net.2",
...     ],
...     rank_dropout=0.0,
...     cell_dropout=0.0,
...     init_weights=True,
...     use_effective_conv2d=True,
... )

>>> model = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
>>> model.text_encoder = LoKrModel(model.text_encoder, config_te, "default")
>>> model.unet = LoKrModel(model.unet, config_unet, "default")

Attributes:

  • model ([~nn.Module]): The model to be adapted.

  • peft_config ([LoKrConfig]): The configuration of the LoKr model.

Source code in mindnlp/peft/tuners/lokr/model.py
class LoKrModel(BaseTuner):
    """
    Creates a Low-Rank Kronecker Product model from a pretrained model. The original method is partially described
    in https://arxiv.org/abs/2108.06098 and in https://arxiv.org/abs/2309.14859. The current implementation
    heavily borrows from
    https://github.com/KohakuBlueleaf/LyCORIS/blob/eb460098187f752a5d66406d3affade6f0a07ece/lycoris/cells/lokr.py

    Args:
        model (`mindspore.nn.Module`): The model to which the adapter tuner layers will be attached.
        peft_config ([`LoKrConfig`]): The configuration of the LoKr model.
        adapter_name (`str`): The name of the adapter, defaults to `"default"`.

    Returns:
        LoKrModel ([`mindspore.nn.Module`]): The LoKr model.

    Example:
        ```py
        >>> from diffusers import StableDiffusionPipeline
        >>> from peft import LoKrModel, LoKrConfig

        >>> config_te = LoKrConfig(
        ...     r=8,
        ...     lora_alpha=32,
        ...     target_cells=["k_proj", "q_proj", "v_proj", "out_proj", "fc1", "fc2"],
        ...     rank_dropout=0.0,
        ...     cell_dropout=0.0,
        ...     init_weights=True,
        ... )
        >>> config_unet = LoKrConfig(
        ...     r=8,
        ...     lora_alpha=32,
        ...     target_cells=[
        ...         "proj_in",
        ...         "proj_out",
        ...         "to_k",
        ...         "to_q",
        ...         "to_v",
        ...         "to_out.0",
        ...         "ff.net.0.proj",
        ...         "ff.net.2",
        ...     ],
        ...     rank_dropout=0.0,
        ...     cell_dropout=0.0,
        ...     init_weights=True,
        ...     use_effective_conv2d=True,
        ... )

        >>> model = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
        >>> model.text_encoder = LoKrModel(model.text_encoder, config_te, "default")
        >>> model.unet = LoKrModel(model.unet, config_unet, "default")
        ```

    > **Attributes**:  

    >   - **model** ([`~nn.Module`])— The model to be adapted. 

    >   - **peft_config** ([`LoKrConfig`]): The configuration of the LoKr  model. 

    """
    prefix: str = "lokr_"
    layers_mapping: Dict[Type[nn.Module], Type[LoKrLayer]] = {
        nn.Conv2d: Conv2d,
        nn.Linear: Dense,
    }

    def _create_and_replace(
        self,
        config: LoKrConfig,
        adapter_name: str,
        target: Union[LoKrLayer, nn.Module],
        target_name: str,
        parent: nn.Module,
        current_key: str,
        loaded_in_8bit: Optional[bool] = False,
        loaded_in_4bit: Optional[bool] = False,
    ) -> None:
        """
        A private method to create and replace the target cell with the adapter cell.
        """
        # Regexp matching - Find key which matches current target_name in patterns provided
        pattern_keys = list(
            chain(config.rank_pattern.keys(), config.alpha_pattern.keys())
        )
        target_name_key = next(
            filter(lambda key: re.match(rf"(.*\.)?{key}$", current_key), pattern_keys),
            target_name,
        )
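        # e.g. with rank_pattern={"k_proj": 4}, a key such as "model.layers.0.k_proj"
        # matches via the regex above; keys with no pattern match fall back to target_name.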

        kwargs = config.to_dict()
        kwargs["r"] = config.rank_pattern.get(target_name_key, config.r)
        kwargs["alpha"] = config.alpha_pattern.get(target_name_key, config.lora_alpha)

        if isinstance(target, LoKrLayer):
            target.update_layer(adapter_name, **kwargs)
        else:
            new_cell = self._create_new_cell(config, adapter_name, target, **kwargs)
            self._replace_cell(parent, target_name, new_cell, target)

    @classmethod
    def _create_new_cell(
        cls, config: LoKrConfig, adapter_name: str, target: nn.Module, **kwargs
    ) -> LoKrLayer:
        r"""
        This method creates a new LoKrLayer instance based on the provided parameters.

        Args:
            cls (class): The class reference. It is used to access the class-level layers_mapping attribute.
            config (LoKrConfig): The configuration object used for creating the new cell.
            adapter_name (str): The name of the adapter to be associated with the new cell.
            target (nn.Module): The target cell for which the new cell is being created.

        Returns:
            LoKrLayer: Returns a new instance of LoKrLayer representing the created cell.

        Raises:
            ValueError: If the target cell type is not supported, an exception is raised, indicating the unsupported cell type. 
                This occurs when the target cell type does not match any of the supported cell types in the layers_mapping attribute.
        """
        # Find corresponding subtype of provided target cell
        new_cell_cls = None
        for subtype, target_cls in cls.layers_mapping.items():
            if (
                hasattr(target, "base_layer")
                and isinstance(target.get_base_layer(), subtype)
                and isinstance(target, BaseTunerLayer)
            ):
                # nested tuner layers are allowed
                new_cell_cls = target_cls
                break
            elif isinstance(target, subtype):
                new_cell_cls = target_cls
                break

        # We didn't find corresponding type, so adapter for this layer is not supported
        if new_cell_cls is None:
            supported_cells = ", ".join(
                layer.__name__ for layer in cls.layers_mapping.keys()
            )
            raise ValueError(
                f"Target cell of type {type(target)} not supported, "
                f"currently only adapters for {supported_cells} are supported"
            )

        if isinstance(target, BaseTunerLayer):
            target_base_layer = target.get_base_layer()
        else:
            target_base_layer = target

        if isinstance(target_base_layer, nn.Module):
            new_cell = new_cell_cls(target, adapter_name=adapter_name, **kwargs)
        else:
            supported_cells = ", ".join(
                layer.__name__ for layer in cls.layers_mapping.keys()
            )
            raise ValueError(
                f"Target cell of type {type(target)} not supported, "
                f"currently only adapters for {supported_cells} are supported"
            )

        return new_cell

    def __getattr__(self, name: str):
        """Forward missing attributes to the wrapped cell."""
        try:
            return super().__getattr__(name)  # defer to nn.Module's logic
        except AttributeError:
            return getattr(self.model, name)

    def _replace_cell(self, parent, child_name, new_cell, child):
        r"""
        Replaces a cell in the LoKrModel with a new cell.

        Args:
            self (LoKrModel): The instance of the LoKrModel class.
            parent: The parent object containing the cell to be replaced.
            child_name: The name of the child object to be replaced.
            new_cell: The new cell object to be assigned.
            child: The child object to be replaced.

        Returns:
            None. This method does not return any value.

        Raises:
            None.
        """
        setattr(parent, child_name, new_cell)

        # child layer wraps the original cell, unpack it
        if hasattr(child, "base_layer"):
            child = child.base_layer

        # layers with base_layer don't need the weight to be copied, as they have a reference already
        if not hasattr(new_cell, "base_layer"):
            new_cell.weight = child.weight
            if hasattr(child, "bias"):
                new_cell.bias = child.bias

        if getattr(child, "state", None) is not None:
            if hasattr(new_cell, "base_layer"):
                new_cell.base_layer.state = child.state
            else:
                new_cell.state = child.state

    def _mark_only_adapters_as_trainable(self, model: nn.Module) -> None:
        r"""
        The _mark_only_adapters_as_trainable method in the LoKrModel class marks only the adapters in the
        provided model as trainable, by setting the requires_grad attribute to False for parameters not
        containing the specified prefix.

        Args:
            self (LoKrModel): The instance of the LoKrModel class.
            model (nn.Module): The model for which the adapters are to be marked as trainable.

        Returns:
            None: This method does not return any value.

        Raises:
            None
        """
        for n, p in model.parameters_and_names():
            if self.prefix not in n:
                p.requires_grad = False

    def _set_adapter_layers(self, enabled=True):
        r"""
        Sets the adapter layers in the LoKrModel by enabling or disabling them.

        Args:
            self (LoKrModel): The instance of the LoKrModel class.
            enabled (bool, optional): Indicates whether to enable or disable the adapter layers. Defaults to True.

        Returns:
            None. This method does not return any value.

        Raises:
            None.
        """
        for cell in self.model.cells():
            if isinstance(cell, (BaseTunerLayer, ModulesToSaveWrapper)):
                cell.enable_adapters(enabled)

    def _unload_and_optionally_merge(
        self,
        merge: bool = True,
        progressbar: bool = False,
        safe_merge: bool = False,
        adapter_names: Optional[List[str]] = None,
    ):
        """
        Method to unload and optionally merge the model.

        Args:
            self (LoKrModel): The current instance of the LoKrModel class.
            merge (bool): A flag indicating whether to merge the model. Defaults to True.
            progressbar (bool): A flag indicating whether to display a progress bar. Defaults to False.
            safe_merge (bool): A flag indicating whether to perform a safe merge. Defaults to False.
            adapter_names (Optional[List[str]]): A list of adapter names. Defaults to None.

        Returns:
            None: This method does not return any value.

        Raises:
            ValueError: If the model is gptq quantized and merge is True, it raises a ValueError with the message
                "Cannot merge LOKR layers when the model is gptq quantized".
            AttributeError: If an attribute error occurs during the method execution.
        """
        if merge:
            if getattr(self.model, "quantization_method", None) == "gptq":
                raise ValueError(
                    "Cannot merge LOHA layers when the model is gptq quantized"
                )

        self._unloading_checks(adapter_names)
        key_list = [
            key for key, _ in self.model.named_cells() if self.prefix not in key
        ]
        desc = "Unloading " + ("and merging " if merge else "") + "model"
        for key in tqdm(key_list, disable=not progressbar, desc=desc):
            try:
                parent, target, target_name = _get_subcells(self.model, key)
            except AttributeError:
                continue

            if hasattr(target, "base_layer"):
                if merge:
                    target.merge(safe_merge=safe_merge, adapter_names=adapter_names)
                self._replace_cell(
                    parent, target_name, target.get_base_layer(), target
                )
            elif isinstance(target, ModulesToSaveWrapper):
                # save any additional trainable cells part of `cells_to_save`
                new_cell = target.cells_to_save[target.active_adapter]
                if hasattr(new_cell, "base_layer"):
                    # check if the cell is itself a tuner layer
                    if merge:
                        new_cell.merge(
                            safe_merge=safe_merge, adapter_names=adapter_names
                        )
                    new_cell = new_cell.get_base_layer()
                setattr(parent, target_name, new_cell)

        return self.model

    def _unloading_checks(self, adapter_names: Optional[List[str]]):
        r"""
        Perform unloading checks for the LoKrModel class.

        This method checks if multiple adapters with `cells_to_save` specified can be unloaded.
        If any of the specified adapters have cells to save, unloading multiple adapters is not allowed.

        Args:
            self (LoKrModel): An instance of the LoKrModel class.
            adapter_names (Optional[List[str]]): A list of adapter names to consider for unloading. If not provided, all active adapters will be considered.

        Returns:
            None. This method does not return any value.

        Raises:
            ValueError: If multiple adapters with `cells_to_save` specified are attempted to be unloaded.

        """
        adapters_to_consider = adapter_names or self.active_adapters
        is_cells_to_save_available = any(
            self.peft_config[adapter].cells_to_save
            for adapter in adapters_to_consider
        )
        if is_cells_to_save_available and len(adapters_to_consider) > 1:
            raise ValueError(
                "Cannot unload multiple adapters that specify `cells_to_save`."
            )

    @staticmethod
    def _prepare_adapter_config(peft_config, model_config):
        r"""
        Prepare adapter configuration based on PEFT and model configurations.

        Args:
            peft_config (object): The configuration object for PEFT.
                It should contain information about the target cells.
                Required parameter. Must not be None.
            model_config (object): The configuration object for the model.

        Returns:
            None. This method does not return any value.

        Raises:
            ValueError: If `target_cells` is not specified in `peft_config`.
        """
        if peft_config.target_cells is None:
            raise ValueError("Please specify `target_cells` in `peft_config`")
        return peft_config

    @staticmethod
    def _check_target_cell_exists(LoKR_config, key):
        r"""
        Checks if a target cell exists in the LoKR configuration.

        Args:
            LoKR_config (LoKrConfig): The LoKR configuration containing information about the target cells.
            key (str): The key corresponding to the target cell to be checked.

        Returns:
            The result of `check_target_cell_exists`: truthy if `key` matches a target cell in the LoKR
            configuration, falsy otherwise.

        Raises:
            This method does not raise any exceptions explicitly.
        """
        return check_target_cell_exists(LoKR_config, key)

mindnlp.peft.tuners.lokr.model.LoKrModel.__getattr__(name)

Forward missing attributes to the wrapped cell.

Source code in mindnlp/peft/tuners/lokr/model.py
def __getattr__(self, name: str):
    """Forward missing attributes to the wrapped cell."""
    try:
        return super().__getattr__(name)  # defer to nn.Module's logic
    except AttributeError:
        return getattr(self.model, name)
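
As a consequence of this delegation, attributes defined only on the wrapped network stay reachable through the tuner. A short illustration, reusing the hypothetical TinyNet from the sketch above:

peft_model = LoKrModel(TinyNet(), LoKrConfig(r=4, target_cells=["q_proj"]), "default")

# `q_proj` is not defined on LoKrModel itself; __getattr__ falls through to the
# wrapped network, so the lookup resolves on TinyNet (now an adapter cell).
print(peft_model.q_proj)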