config

mindnlp.peft.config

configs

mindnlp.peft.config.PeftConfig dataclass

Bases: PeftConfigMixin

This is the base configuration class to store the configuration of a [PeftModel].

PARAMETER DESCRIPTION
peft_type

The type of Peft method to use.

TYPE: Union[[`~peft.utils.config.PeftType`], `str`] DEFAULT: None

task_type

The type of task to perform.

TYPE: Union[[`~peft.utils.config.TaskType`], `str`] DEFAULT: None

inference_mode

Whether to use the Peft model in inference mode.

TYPE: `bool`, defaults to `False` DEFAULT: False

Source code in mindnlp/peft/config.py
@dataclass
class PeftConfig(PeftConfigMixin):
    """
    This is the base configuration class to store the configuration of a [`PeftModel`].

    Args:
        peft_type (Union[[`~peft.utils.config.PeftType`], `str`]): The type of Peft method to use.
        task_type (Union[[`~peft.utils.config.TaskType`], `str`]): The type of task to perform.
        inference_mode (`bool`, defaults to `False`): Whether to use the Peft model in inference mode.
    """
    base_model_name_or_path: str = field(default=None, metadata={"help": "The name of the base model to use."})
    peft_type: Union[str, PeftType] = field(default=None, metadata={"help": "Peft type"})
    task_type: Union[str, TaskType] = field(default=None, metadata={"help": "Task type"})
    inference_mode: bool = field(default=False, metadata={"help": "Whether to use inference mode"})
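As a sketch of how such a config dataclass behaves, the snippet below uses a minimal stand-in replica rather than the real `mindnlp` class (so the `PeftType`/`TaskType` enums are replaced by plain strings, and `field` metadata is omitted):

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in replica of PeftConfig for illustration only; the real class lives
# in mindnlp.peft.config and accepts PeftType/TaskType enum values as well.
@dataclass
class PeftConfig:
    base_model_name_or_path: Optional[str] = None
    peft_type: Optional[str] = None
    task_type: Optional[str] = None
    inference_mode: bool = False

config = PeftConfig(peft_type="LORA", task_type="SEQ_CLS")
print(config.inference_mode)  # False (the default)
```

All four fields default to `None`/`False`, so a subclass for a concrete PEFT method only needs to fill in what it knows.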

mindnlp.peft.config.PeftConfigMixin dataclass

This is the base configuration class for PEFT adapter models. It contains all the methods that are common to all PEFT adapter models. The method save_pretrained will save the configuration of your adapter model in a directory. The method from_pretrained will load the configuration of your adapter model from a directory.

PARAMETER DESCRIPTION
peft_type

The type of Peft method to use.

TYPE: Union[[`~peft.utils.config.PeftType`], `str`] DEFAULT: None

Source code in mindnlp/peft/config.py
@dataclass
class PeftConfigMixin():
    r"""
    This is the base configuration class for PEFT adapter models. 
    It contains all the methods that are common to all PEFT adapter models.
    The method `save_pretrained` will save the configuration of your adapter model in a directory.
    The method `from_pretrained` will load the configuration of your adapter model from a directory.

    Args:
        peft_type (Union[[`~peft.utils.config.PeftType`], `str`]): The type of Peft method to use.
    """
    peft_type: Optional[PeftType] = field(default=None, metadata={"help": "The type of PEFT model."})

    @property
    def __dict__(self):
        r"""
        Method '__dict__' in the class 'PeftConfigMixin' returns a dictionary representation of the object using the 'asdict' function.

        Args:
            self: The instance of the class. This parameter represents the object for which the dictionary representation is generated.

        Returns:
            dict: A dictionary representation of the instance's fields, produced by `dataclasses.asdict`.
        """
        return asdict(self)

    def to_dict(self):
        """Return the configuration as a dict."""
        return self.__dict__

    def save_pretrained(self, save_directory, **kwargs):
        r"""
        This method saves the configuration of your adapter model in a directory.

        Args:
            save_directory (`str`):
                The directory where the configuration will be saved.
            kwargs (additional keyword arguments, *optional*):
                Additional keyword arguments passed along to the 
                [`~transformers.utils.PushToHubMixin.push_to_hub`] method.
        """
        if os.path.isfile(save_directory):
            raise AssertionError(f"Provided path ({save_directory}) should be a directory, not a file")

        os.makedirs(save_directory, exist_ok=True)

        output_dict = asdict(self)
        # converting set type to list
        for key, value in output_dict.items():
            if isinstance(value, set):
                output_dict[key] = list(value)
        output_path = os.path.join(save_directory, CONFIG_NAME)

        # save it
        with open(output_path, "w", encoding='utf-8') as writer:
            writer.write(json.dumps(output_dict, indent=2, sort_keys=True))

    @classmethod
    def from_pretrained(cls, pretrained_model_name_or_path, subfolder=None, **kwargs):
        r"""
        This method loads the configuration of your adapter model from a directory.

        Args:
            pretrained_model_name_or_path (`str`):
                The directory or the Hub repository id where the configuration is saved.
            kwargs (additional keyword arguments, *optional*):
                Additional keyword arguments passed along to the child class initialization.
        """
        path = (
            os.path.join(pretrained_model_name_or_path, subfolder)
            if subfolder is not None
            else pretrained_model_name_or_path
        )
        # read config file
        if os.path.isfile(os.path.join(path, CONFIG_NAME)):
            config_file = os.path.join(path, CONFIG_NAME)
        else:
            raise ValueError(f"Can't find '{CONFIG_NAME}' at '{pretrained_model_name_or_path}'")

        loaded_attributes = cls.from_json_file(config_file)

        config = cls(**kwargs)

        for key, value in loaded_attributes.items():
            if hasattr(config, key):
                setattr(config, key, value)

        return config

    @classmethod
    def from_json_file(cls, path_json_file, **kwargs):
        r"""
        Loads a configuration file from a json file.

        Args:
            path_json_file (`str`):
                The path to the json file.
        """
        with open(path_json_file, "r", encoding='utf-8') as file:
            json_object = json.load(file)

        return json_object

    @property
    def is_prompt_learning(self):
        r"""
        Utility method to check if the configuration is for prompt learning.
        """
        return False

mindnlp.peft.config.PeftConfigMixin.__dict__ property

Method `__dict__` in the class `PeftConfigMixin` returns a dictionary representation of the object using the `asdict` function.

PARAMETER DESCRIPTION
self

The instance of the class. This parameter represents the object for which the dictionary representation is generated.

RETURNS DESCRIPTION

dict: A dictionary representation of the instance's fields, produced by `dataclasses.asdict`.

mindnlp.peft.config.PeftConfigMixin.is_prompt_learning property

Utility method to check if the configuration is for prompt learning.

mindnlp.peft.config.PeftConfigMixin.from_json_file(path_json_file, **kwargs) classmethod

Loads a configuration file from a json file.

PARAMETER DESCRIPTION
path_json_file

The path to the json file.

TYPE: `str`

Source code in mindnlp/peft/config.py
@classmethod
def from_json_file(cls, path_json_file, **kwargs):
    r"""
    Loads a configuration file from a json file.

    Args:
        path_json_file (`str`):
            The path to the json file.
    """
    with open(path_json_file, "r", encoding='utf-8') as file:
        json_object = json.load(file)

    return json_object
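`from_json_file` is a thin wrapper around `json.load`: it returns the parsed file as a plain dict. An equivalent stdlib-only sketch (the file name here is illustrative):

```python
import json
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "adapter_config.json")  # illustrative file name
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"peft_type": "LORA", "inference_mode": False}, f)
    # This is all from_json_file does: open, parse, return the dict.
    with open(path, "r", encoding="utf-8") as f:
        loaded = json.load(f)

print(loaded["peft_type"])  # LORA
```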

mindnlp.peft.config.PeftConfigMixin.from_pretrained(pretrained_model_name_or_path, subfolder=None, **kwargs) classmethod

This method loads the configuration of your adapter model from a directory.

PARAMETER DESCRIPTION
pretrained_model_name_or_path

The directory or the Hub repository id where the configuration is saved.

TYPE: `str`

kwargs

Additional keyword arguments passed along to the child class initialization.

TYPE: additional keyword arguments, *optional* DEFAULT: {}

Source code in mindnlp/peft/config.py
@classmethod
def from_pretrained(cls, pretrained_model_name_or_path, subfolder=None, **kwargs):
    r"""
    This method loads the configuration of your adapter model from a directory.

    Args:
        pretrained_model_name_or_path (`str`):
            The directory or the Hub repository id where the configuration is saved.
        kwargs (additional keyword arguments, *optional*):
            Additional keyword arguments passed along to the child class initialization.
    """
    path = (
        os.path.join(pretrained_model_name_or_path, subfolder)
        if subfolder is not None
        else pretrained_model_name_or_path
    )
    # read config file
    if os.path.isfile(os.path.join(path, CONFIG_NAME)):
        config_file = os.path.join(path, CONFIG_NAME)
    else:
        raise ValueError(f"Can't find '{CONFIG_NAME}' at '{pretrained_model_name_or_path}'")

    loaded_attributes = cls.from_json_file(config_file)

    config = cls(**kwargs)

    for key, value in loaded_attributes.items():
        if hasattr(config, key):
            setattr(config, key, value)

    return config
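Note that the `hasattr` check means unknown keys in the saved JSON are silently skipped. The stand-in replica below mirrors the loading logic shown above with stdlib only (`TinyConfig` and the `CONFIG_NAME` value are assumptions for illustration, not the real mindnlp classes or constant):

```python
import json
import os
import tempfile

CONFIG_NAME = "adapter_config.json"  # assumed value of the CONFIG_NAME constant

class TinyConfig:
    def __init__(self, inference_mode=False):
        self.peft_type = None
        self.inference_mode = inference_mode

    @classmethod
    def from_pretrained(cls, path, **kwargs):
        # Mirrors the method above: read CONFIG_NAME, build a config from
        # kwargs, then copy over only attributes the config already defines.
        config_file = os.path.join(path, CONFIG_NAME)
        with open(config_file, "r", encoding="utf-8") as f:
            loaded_attributes = json.load(f)
        config = cls(**kwargs)
        for key, value in loaded_attributes.items():
            if hasattr(config, key):  # unknown JSON keys are silently skipped
                setattr(config, key, value)
        return config

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, CONFIG_NAME), "w", encoding="utf-8") as f:
        json.dump({"peft_type": "LORA", "unknown_key": 1}, f)
    cfg = TinyConfig.from_pretrained(tmp)

print(cfg.peft_type)                # LORA
print(hasattr(cfg, "unknown_key"))  # False
```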

mindnlp.peft.config.PeftConfigMixin.save_pretrained(save_directory, **kwargs)

This method saves the configuration of your adapter model in a directory.

PARAMETER DESCRIPTION
save_directory

The directory where the configuration will be saved.

TYPE: `str`

kwargs

Additional keyword arguments passed along to the [~transformers.utils.PushToHubMixin.push_to_hub] method.

TYPE: additional keyword arguments, *optional* DEFAULT: {}

Source code in mindnlp/peft/config.py
def save_pretrained(self, save_directory, **kwargs):
    r"""
    This method saves the configuration of your adapter model in a directory.

    Args:
        save_directory (`str`):
            The directory where the configuration will be saved.
        kwargs (additional keyword arguments, *optional*):
            Additional keyword arguments passed along to the 
            [`~transformers.utils.PushToHubMixin.push_to_hub`] method.
    """
    if os.path.isfile(save_directory):
        raise AssertionError(f"Provided path ({save_directory}) should be a directory, not a file")

    os.makedirs(save_directory, exist_ok=True)

    output_dict = asdict(self)
    # converting set type to list
    for key, value in output_dict.items():
        if isinstance(value, set):
            output_dict[key] = list(value)
    output_path = os.path.join(save_directory, CONFIG_NAME)

    # save it
    with open(output_path, "w", encoding='utf-8') as writer:
        writer.write(json.dumps(output_dict, indent=2, sort_keys=True))
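The saving logic can be sketched with a stdlib-only stand-in; `TinyConfig` and the `CONFIG_NAME` value are assumptions for illustration, not the real mindnlp class or constant:

```python
import json
import os
import tempfile
from dataclasses import dataclass, asdict

CONFIG_NAME = "adapter_config.json"  # assumed value of the CONFIG_NAME constant

@dataclass
class TinyConfig:
    peft_type: str = None
    inference_mode: bool = False

    def save_pretrained(self, save_directory):
        # Mirrors the method above: refuse files, create the directory,
        # and serialize the dataclass fields to CONFIG_NAME as sorted JSON.
        if os.path.isfile(save_directory):
            raise AssertionError(
                f"Provided path ({save_directory}) should be a directory, not a file"
            )
        os.makedirs(save_directory, exist_ok=True)
        output_dict = asdict(self)
        # (the real method also converts set values to lists here)
        output_path = os.path.join(save_directory, CONFIG_NAME)
        with open(output_path, "w", encoding="utf-8") as writer:
            writer.write(json.dumps(output_dict, indent=2, sort_keys=True))

with tempfile.TemporaryDirectory() as tmp:
    TinyConfig(peft_type="LORA").save_pretrained(tmp)
    with open(os.path.join(tmp, CONFIG_NAME), "r", encoding="utf-8") as f:
        saved = json.load(f)

print(saved["peft_type"])  # LORA
```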

mindnlp.peft.config.PeftConfigMixin.to_dict()

Return the configuration as a dictionary.

Source code in mindnlp/peft/config.py
def to_dict(self):
    """Return the configuration as a dict."""
    return self.__dict__
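Both `to_dict` and the `__dict__` property route through `dataclasses.asdict`, so `vars(config)` and `config.to_dict()` return the same field dict. A minimal stand-in sketch (not the real mindnlp class):

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class Mixin:
    peft_type: Optional[str] = None

    @property
    def __dict__(self):
        # Overrides the default instance dict lookup with an asdict snapshot.
        return asdict(self)

    def to_dict(self):
        return self.__dict__

m = Mixin(peft_type="LORA")
print(m.to_dict())  # {'peft_type': 'LORA'}
```

Attribute reads and writes still work normally because CPython accesses the real instance dict directly, bypassing the `__dict__` property.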

mindnlp.peft.config.PromptLearningConfig dataclass

Bases: PeftConfig

This is the base configuration class to store the configuration of [PrefixTuning], [PromptEncoder], or [PromptTuning].

PARAMETER DESCRIPTION
num_virtual_tokens

The number of virtual tokens to use.

TYPE: `int` DEFAULT: None

token_dim

The hidden embedding dimension of the base transformer model.

TYPE: `int` DEFAULT: None

num_transformer_submodules

The number of transformer subcells in the base transformer model.

TYPE: `int` DEFAULT: None

num_attention_heads

The number of attention heads in the base transformer model.

TYPE: `int` DEFAULT: None

num_layers

The number of layers in the base transformer model.

TYPE: `int` DEFAULT: None

Source code in mindnlp/peft/config.py
@dataclass
class PromptLearningConfig(PeftConfig):
    """
    This is the base configuration class to store the configuration of [`PrefixTuning`], [`PromptEncoder`], or
    [`PromptTuning`].

    Args:
        num_virtual_tokens (`int`): The number of virtual tokens to use.
        token_dim (`int`): The hidden embedding dimension of the base transformer model.
        num_transformer_submodules (`int`): The number of transformer subcells in the base transformer model.
        num_attention_heads (`int`): The number of attention heads in the base transformer model.
        num_layers (`int`): The number of layers in the base transformer model.
    """
    num_virtual_tokens: int = field(default=None, metadata={"help": "Number of virtual tokens"})
    token_dim: int = field(
        default=None, metadata={"help": "The hidden embedding dimension of the base transformer model"}
    )
    num_transformer_submodules: Optional[int] = field(
        default=None, metadata={"help": "Number of transformer subcells"}
    )
    num_attention_heads: Optional[int] = field(default=None, metadata={"help": "Number of attention heads"})
    num_layers: Optional[int] = field(default=None, metadata={"help": "Number of transformer layers"})
    @property
    def is_prompt_learning(self):
        r"""
        Utility method to check if the configuration is for prompt learning.
        """
        return True
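The only behavioral change over `PeftConfig` is that `is_prompt_learning` flips to `True`, which callers can use to branch between prompt-learning and other adapter types. A minimal stand-in sketch of the override (not the real mindnlp classes):

```python
from dataclasses import dataclass

@dataclass
class BaseConfig:
    @property
    def is_prompt_learning(self):
        return False  # default for non-prompt-learning adapters

@dataclass
class PromptConfig(BaseConfig):
    num_virtual_tokens: int = 20  # illustrative default

    @property
    def is_prompt_learning(self):
        return True  # prompt-learning configs override the base property

print(BaseConfig().is_prompt_learning)    # False
print(PromptConfig().is_prompt_learning)  # True
```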

mindnlp.peft.config.PromptLearningConfig.is_prompt_learning property

Utility method to check if the configuration is for prompt learning.