LumenScopeAI BrainTransformers

The repository ships its own copy of the transformers package; the BrainGPT configuration lives at BrainTransformers-SNN-LLM/transformers/models/braingpt/configuration_braingpt.py.

Add the following import to that file:

from transformers import PretrainedConfig
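
For orientation, here is a minimal sketch of how the top of configuration_braingpt.py looks once the import is in place. The BrainGPTConfig class name and its fields are assumptions for illustration only; keep whatever the repository already defines.

from transformers import PretrainedConfig

class BrainGPTConfig(PretrainedConfig):  # hypothetical sketch; keep the class the repo actually defines
    model_type = "braingpt"

    def __init__(self, vocab_size=32000, hidden_size=2048, **kwargs):
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size
        super().__init__(**kwargs)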

Set up the repository, install the dependencies, and pull the model weights (git-lfs must be installed so the weight files download fully):

git clone https://github.com/LumenScopeAI/BrainTransformers-SNN-LLM.git
cd BrainTransformers-SNN-LLM/
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
sudo apt-get install git-lfs
git lfs install
git clone https://huggingface.co/LumenscopeAI/BrainTransformers-3B-Chat
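
If git-lfs was not active when the model repo was cloned, the weight files end up as small pointer stubs instead of multi-gigabyte shards. A quick sanity check (a sketch; it makes no assumption about the exact shard filenames inside BrainTransformers-3B-Chat):

import os

model_dir = "BrainTransformers-3B-Chat"
for name in sorted(os.listdir(model_dir)):
    if name.endswith((".safetensors", ".bin", ".pth")):
        size_mb = os.path.getsize(os.path.join(model_dir, name)) / (1024 * 1024)
        print(f"{name}: {size_mb:.1f} MB")  # LFS pointer stubs show up as well under 1 MB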

Open run.py and change the model path so it points at the cloned BrainTransformers-3B-Chat directory.
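
As a rough sketch of what that change amounts to (the variable names in run.py may differ, and this assumes run.py loads the model with the standard from_pretrained API through the repo's bundled transformers package):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./BrainTransformers-3B-Chat"  # local clone instead of a hub ID
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)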

In BrainTransformers-SNN-LLM/transformers/models/braingpt/modeling_braingpt.py, the load_silu_approximator helper loads the trained SiLU-approximator checkpoints; note the weights_only=True argument on the torch.load calls (the earlier calls without it are kept as comments for reference):

import os     # near the top of modeling_braingpt.py
import torch

def load_silu_approximator(device, dtype):
    # Build the approximator and move it to the target device/dtype.
    act_fn = SiLUApproximator().to(device).to(dtype)

    # The checkpoints sit next to this source file.
    pos_checkpoint = os.path.join(os.path.dirname(__file__), 'model_pos.pth')
    neg_checkpoint = os.path.join(os.path.dirname(__file__), 'model_neg.pth')

    if os.path.exists(pos_checkpoint) and os.path.exists(neg_checkpoint):
        act_fn.pos_model.load_state_dict(
            torch.load(pos_checkpoint, map_location=device, weights_only=True)
        )
        act_fn.neg_model.load_state_dict(
            torch.load(neg_checkpoint, map_location=device, weights_only=True)
        )
        # Earlier version of the same calls, without weights_only=True:
        # act_fn.pos_model.load_state_dict(
        #     torch.load(pos_checkpoint, map_location=device)
        # )
        # act_fn.neg_model.load_state_dict(
        #     torch.load(neg_checkpoint, map_location=device)
        # )
    else:
        raise FileNotFoundError(
            f"SiLUApproximator parameters not found at {pos_checkpoint} and {neg_checkpoint}"
        )

    return act_fn
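
A quick way to confirm the checkpoints load is to call the helper directly. This is a sketch under two assumptions: the repo's in-tree transformers package is the one on the import path, and SiLUApproximator behaves like an ordinary nn.Module applied elementwise:

import torch
from transformers.models.braingpt.modeling_braingpt import load_silu_approximator

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
act_fn = load_silu_approximator(device, torch.float32)

x = torch.linspace(-4.0, 4.0, steps=9, device=device)
print(act_fn(x))                    # approximated SiLU
print(torch.nn.functional.silu(x))  # reference for comparison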

  
