Leading a New Chapter in Intelligent TCM: "Five Phases Mindset", an Intelligent TCM Consultation Large Model

In an era of rapidly advancing artificial intelligence, and with the goal of inheriting and promoting the essence of traditional Chinese medicine (TCM), the Tencent Yantai New Engineering Institute team, after in-depth research and collaboration with TCM colleges, has developed "Five Phases Mindset", an intelligent TCM consultation large model based on Qwen. The model is fine-tuned on the order of one hundred thousand consultation records and aims to provide users with more accurate, personalized TCM consultation services.

As the name suggests, "Five Phases Mindset" integrates the TCM Five Phases theory into artificial intelligence. Through deep learning, natural language processing, and related techniques, it analyzes a patient's symptoms, produces a diagnosis, and recommends a treatment plan. Compared with traditional in-person consultation, the model offers the following advantages:

1. Efficient and convenient: patients only need to enter their symptoms on a phone or computer, and the system returns a diagnosis and treatment plan within a short time, saving the time spent queuing and waiting.
2. Accurate and personalized: built on large-scale data and deep learning, the model tailors its diagnoses and suggestions to the patient's age, gender, medical history, and other information.
3. Round-the-clock service: the model can provide consultation anytime and anywhere, unconstrained by time or location, meeting patients' needs in different scenarios.
4. Intelligent recommendations: alongside the diagnosis, the model recommends suitable TCM treatments for the patient's situation, including herbal medicine, acupuncture, and tuina massage, to help patients recover.
5. Continuous learning: as data accumulates and the technology iterates, the model's diagnostic accuracy and treatment effectiveness will keep improving.

The release of "Five Phases Mindset" marks an important step in the intelligent development of TCM in China. We believe that in the near future this model will bring patients a more convenient and efficient TCM consultation experience and help the TCM field grow. We also look forward to working with more colleagues across the industry to advance the intelligent transformation of TCM and contribute to human health.

The model family includes 1.8B, 7B, and 14B parameter versions. This release open-sources the 1.8B version, quantized to int4 with GPTQ so that it can run on consumer-grade graphics cards.

Let us look forward to the widespread application of "Five Phases Mindset" across the field of TCM, contributing to the inheritance and promotion of traditional Chinese medical culture!
How to Use

For convenience, we recommend cloning the original Qwen repository and installing its dependencies:
git clone https://github.com/QwenLM/Qwen.git
cd Qwen
pip install -r requirements.txt

Next, make sure auto-gptq (together with optimum) is installed:

pip install auto-gptq optimum

Additionally, if your hardware supports it, you can install flash-attention to improve inference efficiency:

git clone https://github.com/Dao-AILab/flash-attention
pip uninstall -y ninja && pip install ninja
cd flash-attention && pip install .
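Before loading the model, you can optionally confirm that the environment is usable; this quick check is our own suggestion rather than part of the upstream instructions:

# Optional sanity check: confirm a CUDA device is visible and the quantization backend imports cleanly.
import torch
import transformers
import auto_gptq  # raises ImportError if the auto-gptq installation failed

print("CUDA available:", torch.cuda.is_available())
print("transformers version:", transformers.__version__)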
Afterwards, you can run inference:

from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig

tokenizer = AutoTokenizer.from_pretrained("cookey39/Five_Phases_Mindset", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("cookey39/Five_Phases_Mindset", device_map="auto", trust_remote_code=True).eval()

# You can specify different generation hyperparameters such as the generation length and top_p.
model.generation_config = GenerationConfig.from_pretrained("cookey39/Five_Phases_Mindset", trust_remote_code=True)
# Set the temperature to control the diversity of the generated text.
model.generation_config.temperature = 0.6

# Dialogues must follow the consultation format: 你是一位经验丰富中医医生,会根据患者的症状给出诊断和药方/症状:
# For example:
response, _ = model.chat(tokenizer, "你是一位经验丰富中医医生,会根据患者的症状给出诊断和药方/症状:感冒发热流鼻涕", history=None)
print(response)
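Since every request uses the same consultation prefix, a small helper keeps prompts consistent. The helper name and the sample symptoms below are our own illustration and are not part of the released code:

# Hypothetical convenience wrapper: prepend the consultation format the model was fine-tuned on.
CONSULTATION_PREFIX = "你是一位经验丰富中医医生,会根据患者的症状给出诊断和药方/症状:"

def build_prompt(symptoms: str) -> str:
    return CONSULTATION_PREFIX + symptoms

# Example with made-up symptoms (headache, sore throat, cough with phlegm).
response, _ = model.chat(tokenizer, build_prompt("头痛,咽喉肿痛,咳嗽有痰"), history=None)
print(response)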
Deployment

For deployment, please refer to the official Qwen documentation. We recommend the NVIDIA Triton + vLLM deployment path, which can deliver more than a thirtyfold speedup in inference.
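For reference, a minimal offline-inference sketch with vLLM might look like the following. This is our own illustration, not the official deployment recipe: it assumes your vLLM build can load this GPTQ int4 checkpoint directly and that the prompt is wrapped in Qwen's ChatML chat format by hand.

# Minimal vLLM sketch (assumption: vLLM loads this GPTQ checkpoint with quantization="gptq").
from vllm import LLM, SamplingParams

llm = LLM(model="cookey39/Five_Phases_Mindset", quantization="gptq", trust_remote_code=True)

# Qwen chat models expect ChatML-style prompts, so the consultation text is wrapped manually here.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n你是一位经验丰富中医医生,会根据患者的症状给出诊断和药方/症状:感冒发热流鼻涕<|im_end|>\n"
    "<|im_start|>assistant\n"
)

params = SamplingParams(temperature=0.6, max_tokens=512, stop=["<|im_end|>"])
outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)

For a production service, the same model can be exposed behind Triton or vLLM's OpenAI-compatible server as described in the Qwen documentation.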