Adaption_Prompt
mindnlp.peft.tuners.adaption_prompt.config.AdaptionPromptConfig
dataclass
Bases: PeftConfig
Stores the configuration of an [AdaptionPromptModel].
Source code in mindnlp/peft/tuners/adaption_prompt/config.py, lines 23-59.
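An illustrative construction is shown below. This is a minimal sketch; the field names `target_modules`, `adapter_len`, and `adapter_layers` are assumed to mirror the upstream PEFT version of this config and are not taken from the text above.

```python
from mindnlp.peft.tuners.adaption_prompt.config import AdaptionPromptConfig

# Field names assume parity with upstream PEFT: adapt the top 30
# attention layers and prepend 10 trainable prompt tokens to each.
config = AdaptionPromptConfig(
    target_modules="self_attn",
    adapter_len=10,
    adapter_layers=30,
)
print(config.is_adaption_prompt)  # True
```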
mindnlp.peft.tuners.adaption_prompt.config.AdaptionPromptConfig.is_adaption_prompt: bool
property
Return True if this is an adaption prompt config.
mindnlp.peft.tuners.adaption_prompt.config.AdaptionPromptConfig.__post_init__()
This method is called automatically after the initialization of an instance of the 'AdaptionPromptConfig' class.
| PARAMETER | DESCRIPTION |
|---|---|
| `self` | An instance of the 'AdaptionPromptConfig' class. |

| RETURNS | DESCRIPTION |
|---|---|
| `None` | This method does not return any value. |
This method sets the 'peft_type' attribute of the 'AdaptionPromptConfig' instance to 'PeftType.ADAPTION_PROMPT'. The 'peft_type' attribute represents the type of the adaption prompt configuration.
Example

config = AdaptionPromptConfig()
print(config.peft_type)  # PeftType.ADAPTION_PROMPT, set automatically by __post_init__
Source code in mindnlp/peft/tuners/adaption_prompt/config.py, lines 32-54.
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel
Bases: Module
Implements adaption prompts as described in https://arxiv.org/pdf/2303.16199.pdf.
The top L attention cells (L is set by `adapter_layers` in the config) are replaced with AdaptedAttention cells that wrap the original ones but insert trainable prompts with gates (for zero init).
Notes on the multi-adapter pattern:

- We store the states of different adapters by keeping a dictionary of AdaptedAttention cells indexed by adapter name.
- Every time we switch adapters, we remove the cells of the currently active adapter from the model, store them in the dictionary, and replace them with the cells of the new adapter.
- To avoid duplicated and potentially inconsistent state, the currently active adapter is always removed from the dictionary.
- Disabling the adapter would also result in the cells being removed from the model.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 26-181.
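The sketch below wires the pieces together using only the constructor signature documented on this page. The base model class `LlamaForCausalLM`, the checkpoint name, and the config field values are illustrative assumptions rather than verified mindnlp API details.

```python
from mindnlp.transformers import LlamaForCausalLM  # assumed model class
from mindnlp.peft.tuners.adaption_prompt.config import AdaptionPromptConfig
from mindnlp.peft.tuners.adaption_prompt.model import AdaptionPromptModel

# Load a LLaMA-style base model (checkpoint name is illustrative).
base_model = LlamaForCausalLM.from_pretrained("huggyllama/llama-7b")

# One config per adapter; "default" is used as the initial adapter name here.
config = AdaptionPromptConfig(
    target_modules="self_attn",
    adapter_len=10,
    adapter_layers=30,
)
peft_model = AdaptionPromptModel(base_model, configs={"default": config}, adapter_name="default")
```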
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.__getattr__(name)
Forward missing attributes to the wrapped cell.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 174-181.
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.__init__(model, configs, adapter_name)
Initializes an instance of the AdaptionPromptModel class.
| PARAMETER | DESCRIPTION |
|---|---|
| `self` | The current instance of the class. |
| `model` | The underlying model to be used for adaption prompts. Expected to be an object of a specific model class. |
| `configs` | A dictionary containing the configurations required for the adaption prompt model. TYPE: `Dict` |
| `adapter_name` | The name of the adapter to be added to the adaption prompt model. TYPE: `str` |

| RETURNS | DESCRIPTION |
|---|---|
| `None` | |
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 42-73.
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.add_adapter(adapter_name, config)
Add an adapter with the given name and config.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 75-105.
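Continuing the sketch above, additional adapters can be registered under their own names; the adapter name and settings here are hypothetical.

```python
# Register a second adapter with different (hypothetical) settings.
small_config = AdaptionPromptConfig(
    target_modules="self_attn",
    adapter_len=4,
    adapter_layers=8,
)
peft_model.add_adapter("small", small_config)
```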
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.disable_adapter_layers()
Disable adapter layers by swapping out AdaptedAttention cells.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 125-128.
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.enable_adapter_layers()
Enable adapter layers by swapping in cached AdaptedAttention cells.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 120-123.
mindnlp.peft.tuners.adaption_prompt.model.AdaptionPromptModel.set_adapter(adapter_name)
Set the model to use the adapter with the given name.
Source code in mindnlp/peft/tuners/adaption_prompt/model.py, lines 107-118.
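Switching and toggling adapters, continuing the same hypothetical sketch; per the class notes above, each call swaps AdaptedAttention cells between the model and the internal cache.

```python
# Activate the "small" adapter, then switch back to "default".
peft_model.set_adapter("small")
peft_model.set_adapter("default")

# Temporarily run with the original attention cells, then restore the adapter.
peft_model.disable_adapter_layers()
peft_model.enable_adapter_layers()
```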