Research on an interpretable fault diagnosis method based on XGBoost and SHAP analysis

Abstract: To address the problems of existing intelligent fault diagnosis methods, such as reliance on a single type of feature input, difficulty in extracting fault features, and poor model interpretability, an interpretable fault diagnosis method based on XGBoost (extreme gradient boosting) and SHAP (SHapley Additive exPlanations) analysis is proposed. First, traditional signal processing methods are used to extract multi-domain features. Second, a fault diagnosis model is constructed with the XGBoost ensemble algorithm, and a preliminary feature-level explanation of the model is obtained from XGBoost's built-in evaluation metrics. Finally, the Tree SHAP method is applied to interpret the diagnosis model: it examines how the important features influence the predicted bearing fault categories, analyzes the dependence and interaction effects between features, and reveals the diagnostic mechanism of the model in an intuitive and transparent way. Comparative experiments against other traditional machine learning methods show that the proposed model performs best overall across multiple evaluation metrics and attains a fault diagnosis accuracy of 99.62%, demonstrating good practical application value.
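To make the described pipeline concrete, the sketch below shows one possible XGBoost-plus-Tree-SHAP workflow in Python. It is not the authors' code: the feature matrix, labels, and hyperparameters are placeholder assumptions standing in for the multi-domain bearing features and fault classes described in the abstract.

```python
# Minimal sketch (not the authors' implementation): train an XGBoost
# classifier on hypothetical multi-domain bearing features, inspect its
# built-in feature importance, then explain it with Tree SHAP.
import numpy as np
import shap
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Placeholder data: rows are samples, columns stand in for hand-crafted
# time/frequency-domain features; labels stand in for bearing fault classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))        # hypothetical multi-domain feature matrix
y = rng.integers(0, 4, size=1000)      # hypothetical fault labels (4 classes)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# XGBoost-based diagnosis model; hyperparameters are illustrative only.
model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Built-in evaluation metric (gain) gives a first, coarse feature ranking.
print("gain importance:", model.get_booster().get_score(importance_type="gain"))

# Tree SHAP attributes each prediction to the input features, enabling
# per-class and per-sample explanations of the diagnosis model.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)  # per-class attributions; format varies by shap version

# Global view: mean absolute SHAP value per feature across the test set.
shap.summary_plot(shap_values, X_test)
```

In practice the placeholder arrays would be replaced by features extracted from vibration signals, and dependence plots (e.g. shap.dependence_plot) could be used to study the feature interaction effects mentioned in the abstract.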

     
