Research Progress

AI Model Enhances Precision in Cast Blade Manufacturing

Apr 03, 2026

Schematic Diagram of Robotic Grinding Processing for Blades (Image generated by AI)

Cast blades are critical core components in industries such as aerospace, energy, marine engineering, and gas turbines, where they are primarily used as turbine blades within engines. As the final step in the cast blade manufacturing process, grinding is essential for ensuring dimensional accuracy and surface quality.

Recently, a research team from the Manufacturing Equipment and Intelligent Robotics Department at the Shenyang Institute of Automation (SIA) of the Chinese Academy of Sciences proposed a material removal depth prediction model, O-TabPFN, for robotic abrasive belt grinding processes. This model allows a robot to automatically adjust grinding process parameters based on the distribution of machining allowance across different areas of the blade, enabling precise point-by-point material removal and significantly improving machining accuracy and surface consistency. The study was published online in July 2026 in the international journal Precision Engineering.

Cast blades typically feature complex free-form surface structures with characteristics such as twisting, bending, and variable cross-sections. Traditional grinding methods often struggle to achieve uniform force application and consistent machining across the entire blade surface. Additionally, factors such as cooling shrinkage and mold deviations can lead to an uneven distribution of machining allowance. Therefore, adaptive adjustments based on the actual allowance are required during the grinding process.

Abrasive belt grinding is a key technology for machining complex-profile blades used in aero-engines and gas turbines. Current robotic abrasive belt grinding predominantly employs a constant-force grinding method, applying fixed process parameters to uniformly machine the entire workpiece surface. Although this approach offers advantages in process simplicity and efficiency, it faces challenges in practical applications, including insufficient machining accuracy, limited control over material removal, and inconsistent surface quality. Furthermore, the machining allowance distribution on the blade profile is uneven, accompanied by significant curvature variations.

To meet stringent machining quality requirements, precise control of material removal depth is necessary. Developing intelligent grinding methods capable of point-by-point precise removal, along with breakthroughs in key technologies for nonlinear material removal depth modeling, has become a critical challenge in the aviation industry. Accurately adjusting process parameters to optimize grinding precision and effectively establishing a mathematical relationship between process parameters and grinding depth are of great importance.

According to Associate Researcher ZHU Guang, a member of the team led by Researcher LI Lun from the Manufacturing Equipment and Intelligent Robotics Department, the research team constructed a material removal depth prediction model using TabPFN and then applied hyperparameter optimization with the Optuna algorithm to develop the O-TabPFN prediction model.

Its advantages primarily stem from the following: TabPFN incorporates prior knowledge through meta-learning, enabling efficient learning from limited tabular data without requiring large-scale model training. Additionally, hyperparameter optimization based on Optuna facilitates efficient global search, avoiding suboptimal parameter combinations and thereby enhancing model robustness and generalization capability. Consequently, the proposed O-TabPFN prediction model can more accurately capture the nonlinear relationship between process parameters and material removal depth while reducing the risk of local optima. These characteristics make it particularly well-suited for high-precision, data-efficient prediction tasks in robotic abrasive belt grinding.
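The workflow described above, a hyperparameter search wrapped around a tabular regression model, can be sketched schematically. The real study pairs TabPFN with Optuna's global search; in this minimal, self-contained sketch a simple k-nearest-neighbour regressor and random search stand in for them, and all data, parameter names, and the search space are illustrative assumptions, not the paper's settings.

```python
# Schematic of an Optuna-style hyperparameter loop around a tabular regressor.
# A k-NN regressor and random search stand in for TabPFN and Optuna's sampler.
import random
import statistics

random.seed(0)

def removal_depth(v, f, p):
    # Hypothetical nonlinear ground truth: depth grows with contact force (p)
    # and belt speed (v), shrinks with feed rate (f); small Gaussian noise.
    return 0.02 * p * v / (1.0 + f) + random.gauss(0, 0.001)

# Toy "grinding" data: (belt speed, feed rate, contact force) -> removal depth
data = [((v, f, p), removal_depth(v, f, p))
        for v in (8, 12, 16) for f in (1, 2, 3) for p in (10, 20, 30)]
train, test = data[::2], data[1::2]

def knn_predict(x, k):
    # Plain (unweighted) k-NN regression as a stand-in surrogate model.
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, xi)), yi)
                   for xi, yi in train)
    return statistics.mean(y for _, y in dists[:k])

def mae(k):
    # Objective to minimise: mean absolute prediction error on held-out data.
    return statistics.mean(abs(knn_predict(x, k) - y) for x, y in test)

# Optuna-style loop: sample a candidate, evaluate the objective, keep the best.
best_k, best_err = None, float("inf")
for trial in range(20):
    k = random.randint(1, 8)   # "suggest" a hyperparameter value
    err = mae(k)
    if err < best_err:
        best_k, best_err = k, err
print(best_k, best_err)
```

In the actual O-TabPFN setting, the sampled hyperparameters would configure TabPFN rather than a k-NN model, and Optuna's guided sampler replaces the random draw, which is what gives the global search behaviour the article credits with avoiding suboptimal parameter combinations.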

AI model inferring the machining process (Image by SIA)

The researchers established a robotic grinding platform and conducted single-factor experiments on nickel-based superalloy specimens. Experimental results show that the model achieves a prediction accuracy of 95.81% for material removal depth, with an average prediction error of only 0.007316 mm, outperforming various existing mainstream prediction models.
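The reported figures combine an absolute metric (average error in mm) and a relative one (percentage accuracy). The article does not state the paper's exact metric definitions, so the sketch below assumes a common pair: mean absolute error, and accuracy as 100 × (1 − mean relative error). The sample depths are illustrative, not the paper's data.

```python
def grinding_metrics(predicted, measured):
    """Return (mean absolute error in mm, percentage accuracy).

    Accuracy here is assumed to be 100 * (1 - mean relative error).
    """
    abs_errors = [abs(p - m) for p, m in zip(predicted, measured)]
    mae = sum(abs_errors) / len(abs_errors)
    mean_rel = sum(e / m for e, m in zip(abs_errors, measured)) / len(abs_errors)
    return mae, 100.0 * (1.0 - mean_rel)

# Illustrative removal depths in mm (hypothetical values)
measured = [0.150, 0.180, 0.210, 0.170]
predicted = [0.143, 0.186, 0.204, 0.176]
mae, acc = grinding_metrics(predicted, measured)
print(mae, acc)
```

Under this assumed definition, a 95.81% accuracy with a 0.007316 mm average error would correspond to removal depths on the order of a few tenths of a millimetre.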

This research was supported by the National Natural Science Foundation of China, the Natural Science Foundation of Liaoning Province, and other funding programs.
