OneLLM: One Framework to Align All Modalities with Language
[Project Page] [Paper]
News
- 2023.12.01: Released model weights and inference code.
Contents
- Install
- Demo
- Citation
- Acknowledgement
- License
TODO
- [ ] Data
- [ ] Evaluation
- [ ] Training
Install
- Clone the repo into a local folder.

```bash
git clone https://github.com/csuhan/OneLLM
cd OneLLM
```
- Install packages.

```bash
conda create -n onellm python=3.9 -y
conda activate onellm
pip install -r requirements.txt

# install pointnet2 ops
cd model/lib/pointnet2
python setup.py install
```
- Install Apex (optional).

```bash
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --no-build-isolation --config-settings "--build-option=--cpp_ext" --config-settings "--build-option=--cuda_ext" ./
```
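Before moving on to the demo, it can help to sanity-check the environment. The snippet below is a minimal sketch: it only confirms that PyTorch sees a GPU and that the optional Apex build above is importable; it does not exercise the OneLLM model code itself.

```python
# Minimal environment sanity check (sketch, not part of the repo).
import torch

print("torch:", torch.__version__)
print("cuda available:", torch.cuda.is_available())

# Apex is optional; report whether the optional build step succeeded.
try:
    import apex  # noqa: F401
    print("apex: installed")
except ImportError:
    print("apex: not installed (optional)")
```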
Demo
Local Demo: Assuming you have downloaded the weights to ${WEIGHTS_DIR}, run the following command to start a Gradio demo locally.

```bash
python demos/multi_turn_mm.py --gpu_ids 0 --tokenizer_path config/llama2/tokenizer.model --llama_config config/llama2/7B.json --pretrained_path ${WEIGHTS_DIR}/consolidated.00-of-01.pth
```
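If you still need the checkpoint, one option is to fetch it from the Hugging Face Hub with `huggingface_hub`. This is a hedged sketch: the `repo_id` below is an assumption (check the project page for the actual weight location); the filename matches the one used in the demo command above.

```python
# Sketch: download the OneLLM checkpoint via huggingface_hub.
# NOTE: repo_id is an assumption; replace it with the actual
# location linked from the project page.
import os
from huggingface_hub import hf_hub_download

weights_dir = os.environ.get("WEIGHTS_DIR", "./weights")
ckpt_path = hf_hub_download(
    repo_id="csuhan/OneLLM-7B",            # assumed repo id
    filename="consolidated.00-of-01.pth",  # filename from the demo command
    local_dir=weights_dir,
)
print("checkpoint saved to:", ckpt_path)
```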
Citation
```bibtex
@article{han2023onellm,
  title={OneLLM: One Framework to Align All Modalities with Language},
  author={Han, Jiaming and Gong, Kaixiong and Zhang, Yiyuan and Wang, Jiaqi and Zhang, Kaipeng and Lin, Dahua and Qiao, Yu and Gao, Peng and Yue, Xiangyu},
  journal={arXiv preprint arXiv:2312.03700},
  year={2023}
}
```
Acknowledgement
LLaMA, LLaMA-Adapter, LLaMA2-Accessory, Meta-Transformer, ChatBridge
License
This project is developed based on Llama 2. Please refer to the LLAMA 2 Community License.