ai-models-infra

Large Model Training and Inference Optimization Platform for Domestic Intelligent Computing Power

September 14

10:15 - 10:50

Location: Venue 4 - 338

This presentation introduces Qingcheng.ai's design and development of a training and inference optimization platform for domestic large models, built around improving domestic intelligent computing power as its core application scenario. The platform is based on the self-developed Chitu inference engine, provides a unified framework for training and fine-tuning, and applies mixed-precision and low-bit quantization optimization schemes. Through engineering and product practice, it significantly improves model computation speed and reduces compute resource consumption while preserving numerical accuracy and general problem-solving capability.
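As a point of reference for the low-bit quantization mentioned above, the sketch below shows generic symmetric per-tensor INT8 weight quantization in NumPy. It is a minimal illustration of the idea only; the function names are hypothetical and it does not represent the Chitu engine's actual quantization scheme.

```python
import numpy as np

def quantize_int8_symmetric(w: np.ndarray):
    """Symmetric per-tensor INT8 quantization: w is approximated by scale * q."""
    max_abs = np.abs(w).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor from the INT8 values."""
    return q.astype(np.float32) * scale

# Example: quantize a random weight matrix and check the reconstruction error.
w = np.random.randn(4096, 4096).astype(np.float32)
q, scale = quantize_int8_symmetric(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"scale={scale:.5f}, mean abs error={err:.5f}, storage reduced 4x (fp32 -> int8)")
```

Storing weights in 8 or fewer bits cuts memory traffic and footprint, which is the main lever such schemes use to speed up inference on bandwidth-limited accelerators.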

Speakers