Machine learning for predicting laser ablation groove characteristics in polycrystalline diamond

This paper explores the role of machine learning in predicting the texture of laser-machined micro-grooves on polycrystalline diamond (PCD) surfaces. PCD is widely used for manufacturing cutting inserts owing to its exceptional attributes, including hardness and thermal conductivity. Surface micro-texturing enhances machining accuracy and tool lifespan. Among texturing methods, laser micromachining stands out for its precision and efficiency. Six regression models (Elastic Net, Random Forest, Gradient Boosting Regression, XGBoost Regression, Bayesian Regression, and Gaussian Process Regression) are used to predict groove depth and width from laser parameters such as monopulse energy, defocus, and scanning speed. Experiments involve a nanosecond laser system and a commercial PCD tool. The results indicate that both Gradient Boosting and XGBoost excel at predicting micro-groove texture, with XGBoost slightly outperforming, a margin credited to its enhancements over standard Gradient Boosting. The paper concludes that machine learning models, especially XGBoost and Gradient Boosting, effectively forecast micro-groove features on laser-machined PCD surfaces, offering insights for further research and practical applications in this domain.


Introduction
Surface micro-texturing is an important surface engineering technique that studies minute structural changes on tool surfaces. Research has shown that preparing microstructures on specific tool surfaces yields multiple benefits. First, it effectively reduces friction and wear, making the tools more durable. Second, microstructures provide additional storage space for cutting lubricants, improving lubrication during cutting and reducing energy losses and wear. Additionally, they help control chip flow during machining, enhancing machining accuracy and efficiency [1][2].
Methods for preparing microstructures on PCD cutting tool surfaces include laser micromachining, electrical discharge machining, chemical etching, ion implantation, and mechanical machining. Among these, laser micromachining is considered economically efficient thanks to its high precision, non-contact processing, and good controllability. It can create complex microtexture structures, and by adjusting the laser process parameters, intricate structures can be fabricated precisely [3][4]. However, laser micromachining often requires iterative experimentation to fine-tune the process parameters for a specific material and target structure, ensuring that the microstructures meet the required specifications and quality standards; this trial-and-error approach consumes a significant amount of time, effort, and cost [5]. These issues can be addressed by integrating machine learning. Machine learning methods can analyze a large body of machining data and laser process parameters, learn from them, and establish models that predict the process parameter combinations needed to achieve the desired microstructure.
In this study, regression modeling was used to estimate the depth and width of laser-machined micro-grooves on PCD surfaces. To achieve accurate predictions, several machine learning models were explored, with a focus on non-linear ones: Elastic Net, Random Forest, Gradient Boosting, XGBoost, Bayesian Regression, and Gaussian Process Regression. These models aim to capture the complex relationships between laser parameters and micro-groove texture. By comparing model performance, the most suitable model type was identified.
This effort applies regression modeling and industrially oriented machine learning to improve the fabrication of laser-machined micro-grooves on PCD surfaces, yielding insights for optimizing precise texture creation and for broader engineering applications.

Machine learning model
In this research, six regression models were chosen: Elastic Net, Random Forest, Gradient Boosting Regression, XGBoost Regression, Bayesian Regression, and Gaussian Process Regression. These models take laser monopulse energy, defocus amount, and scanning speed as inputs, with the aim of modeling the depth and width of micro-groove textures. By learning from experimental data, the models capture the intricate relationships between process parameters and micro-groove characteristics. This enables researchers to accurately predict micro-groove characteristics for different process parameter configurations in practical applications, thereby optimizing the laser-based preparation of PCD micro-groove textures.
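The model setup described above can be sketched in scikit-learn. This is a minimal illustration, not the authors' actual code: the data below are synthetic placeholders (the real experimental values are in Table I), and XGBoost is shown as a commented-out option because it lives in the separate `xgboost` package.

```python
import numpy as np
from sklearn.linear_model import ElasticNet, BayesianRidge
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor

# Synthetic placeholder data: columns are monopulse energy, defocus amount,
# and scanning speed; the target stands in for groove depth.
rng = np.random.default_rng(0)
X = rng.uniform([10.0, -0.5, 100.0], [100.0, 0.5, 1000.0], size=(60, 3))
y = 0.5 * X[:, 0] - 20.0 * np.abs(X[:, 1]) - 0.01 * X[:, 2] + rng.normal(0, 1, 60)

# The six model families compared in the paper (XGBoost via its own package).
models = {
    "Elastic Net": ElasticNet(),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
    # "XGBoost": xgboost.XGBRegressor(),  # requires the optional xgboost package
    "Bayesian Regression": BayesianRidge(),
    "Gaussian Process": GaussianProcessRegressor(),
}

for name, model in models.items():
    model.fit(X, y)                      # fit each model on the same inputs
    print(name, round(model.score(X, y), 3))  # R^2 on the training data
```

A second, identically structured set of models would be trained for groove width.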

Experimental Details
Figure 1 illustrates the schematic of the nanosecond laser machining system used in the experiments. The workbench can move along the x, y, and z axes, enabling precise positioning for texture machining at different locations on the workpiece (PCD tools) and allowing both positive and negative defocusing. The microscopic morphology observed after processing is shown in Figure 3(a). Irradiating PCD material with different laser energies produces inconsistent microscopic morphologies. When the laser energy is low, the depth and width of the micro-grooves are relatively small, consistent with the physical characteristics of nanosecond laser ablation. However, with increasing laser energy a 'necking' phenomenon emerges in the formed micro-groove textures, as seen in Figure 3(b) at B2, B6, B3, and B5. The primary reason is that during the interaction between the nanosecond laser and the PCD material, laser ablation melts and vaporizes the material, accompanied by plasma generation and explosions. Molten material is ejected outward, and part of it contacts the inner wall of the pit, where it cools into a recast layer that produces the 'necking' effect. As a result, the micro-groove walls are not vertical, which poses challenges for width measurement. To establish a consistent measurement standard, we measured the width at the top plane (B1 to B7), at the expected necking (B2 to B6), and along the midline of the two (P1 to P2), and took the average of these three widths as the width measurement value. The calculation formula is as follows:

W = (W_B1B7 + W_B2B6 + W_P1P2) / 3

Influence of Laser Parameters on Micro-Groove Texture
All experimental results, comprising the three independent variables and the two dependent variables, were plotted in an interactive three-dimensional scatter plot, as shown in Figure 4. The varying shades of color on the graph indicate the influence of the laser parameters on depth and width.
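A plot of this kind can be reproduced with matplotlib's 3D scatter support. The values below are hypothetical stand-ins for the experimental data, used only to show the plotting mechanics; the color axis encodes the dependent variable as in Figure 4.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical stand-in values for the three laser parameters and a
# derived "depth" response; the paper's real data are in Table I.
rng = np.random.default_rng(3)
energy = rng.uniform(10.0, 100.0, 30)    # monopulse energy
defocus = rng.uniform(-0.5, 0.5, 30)     # defocus amount
speed = rng.uniform(100.0, 1000.0, 30)   # scanning speed
depth = 0.5 * energy - 20.0 * np.abs(defocus) - 0.01 * speed

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
sc = ax.scatter(energy, defocus, speed, c=depth, cmap="viridis")
ax.set_xlabel("Energy")
ax.set_ylabel("Defocus")
ax.set_zlabel("Speed")
fig.colorbar(sc, label="Depth")          # color shade maps to the response
fig.savefig("laser_params_depth.png", dpi=150)
```

Swapping `depth` for the width measurements gives the companion panel.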

Model Hyperparameter Settings
In this section, we performed hyperparameter optimization for each regression model. We used 80% of the data for training the machine learning models and reserved the remaining 20% for testing their performance. Hyperparameters play a critical role in machine learning, as they significantly affect model performance and the training process. By optimizing them, we can find the parameter combination best suited to a specific problem, improving model performance and generalization ability. Grid search is a commonly used hyperparameter tuning method that explores different combinations of hyperparameters to find the optimal configuration [6]. The process involves defining the hyperparameter ranges, generating the combinations, training and evaluating the model for each, and selecting the best-performing configuration. The optimal hyperparameter values found this way are then used in the predictive models.
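The 80/20 split plus grid search described above maps directly onto scikit-learn's `train_test_split` and `GridSearchCV`. The data and the parameter ranges below are illustrative assumptions, not the grids used in the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic placeholder data standing in for the experimental samples.
rng = np.random.default_rng(1)
X = rng.uniform(size=(80, 3))
y = X @ [3.0, -2.0, 1.0] + rng.normal(0, 0.1, 80)

# 80% training / 20% testing, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Illustrative hyperparameter ranges; the real grids are model-specific.
param_grid = {
    "n_estimators": [50, 100],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}

# Grid search: train and evaluate every combination, keep the best one.
search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid,
    cv=5,
    scoring="neg_mean_squared_error",
)
search.fit(X_train, y_train)
print(search.best_params_)               # best combination found
print(search.score(X_test, y_test))      # negated MSE on the held-out 20%
```

The same pattern repeats for each of the six models, changing only the estimator and its grid.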

Comparison and Analysis of Model Performance
Cross-validation is a statistical method for assessing the performance of machine learning models. It divides the dataset into multiple subsets and repeatedly uses these subsets for model training and validation. This overcomes the randomness and variability that a single data split can introduce, providing a more stable and reliable assessment of model performance and a more accurate estimate of how the model will perform on unseen data [7].
In this study, we employed 5-fold cross-validation (K = 5). This approach makes effective use of the information in the dataset and provides a stable assessment of model performance; by using each subset in turn as the validation set, it avoids overfitting or underfitting on a particular data split and yields a more reliable evaluation of the model's ability to generalize to unseen data. For micro-groove width, the minimum MSE, MAE, and RMSE belong to Gradient Boosting and XGBoost; the same holds for micro-groove depth.
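The 5-fold scheme with the four reported metrics can be sketched with scikit-learn's `cross_validate`, which accepts several scorers at once. Again the data are synthetic placeholders; note that scikit-learn negates error metrics so that higher is always better.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_validate

# Synthetic placeholder data standing in for the experimental samples.
rng = np.random.default_rng(4)
X = rng.uniform(size=(100, 3))
y = X @ [3.0, -2.0, 1.0] + rng.normal(0, 0.1, 100)

# The four metrics reported in the paper: MSE, MAE, RMSE, and R^2 Score.
scoring = {
    "MSE": "neg_mean_squared_error",
    "MAE": "neg_mean_absolute_error",
    "RMSE": "neg_root_mean_squared_error",
    "R2": "r2",
}

# 5-fold cross-validation (K = 5): each fold serves once as validation set.
cv = cross_validate(GradientBoostingRegressor(random_state=0), X, y,
                    cv=5, scoring=scoring)

for name in scoring:
    vals = cv[f"test_{name}"]
    # Error metrics come back negated; report their absolute mean over folds.
    print(name, round(abs(vals.mean()), 4))
```

Running this per model and per target (width, depth) produces the comparison summarized in Figure 5.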
In selecting the best model, XGBoost slightly outperforms Gradient Boosting. Both are ensemble learning algorithms; XGBoost (Extreme Gradient Boosting) is a specific implementation and extension of Gradient Boosting that optimizes its performance and efficiency. Its slight advantage may stem from the innovations it introduces, such as regularization, custom loss functions, handling of missing values, feature selection, and optimization strategies for the gradient boosting process [8].
For the prediction of width, as shown in Figure 6, some models such as Random Forest and Gaussian Process Regression exhibit larger deviations between predicted and actual values. In contrast, the predictions of the Gradient Boosting and XGBoost models are highly accurate, which aligns with the cross-validation metrics MSE, MAE, RMSE, and Score.
For the prediction of depth, as shown in Figure 7, most points lie along the trend line. The predictions of the Gradient Boosting and XGBoost models closely match the actual values, indicating that these models are well suited to this study.
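Actual-versus-predicted plots of the kind shown in Figures 6 and 7 follow a standard parity-plot pattern: scatter the test-set predictions against the measured values and draw the y = x trend line. This sketch uses synthetic data and one model; the paper repeats it for all six.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic placeholder data standing in for the experimental samples.
rng = np.random.default_rng(5)
X = rng.uniform(size=(100, 3))
y = X @ [3.0, -2.0, 1.0] + rng.normal(0, 0.1, 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Fit on the training split, predict on the held-out split.
pred = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr).predict(X_te)

fig, ax = plt.subplots()
ax.scatter(y_te, pred)
# The y = x line: points on it are perfect predictions.
lims = [min(y_te.min(), pred.min()), max(y_te.max(), pred.max())]
ax.plot(lims, lims, "k--", label="y = x")
ax.set_xlabel("Actual")
ax.set_ylabel("Predicted")
ax.legend()
fig.savefig("parity_plot.png", dpi=150)
```

The closer the scatter hugs the dashed line, the better the model, which is how Figures 6 and 7 are read visually.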

Conclusion
In this study, we conducted laser slotting experiments on the surface of PCD materials and used various machine learning regression models to predict the input-output relationship of laser slotting.
To compare model performance, we employed cross-validation to assess the models' effectiveness. Based on the cross-validation results, both Gradient Boosting and XGBoost performed remarkably well in predicting slot width and depth, with XGBoost holding a slight advantage. This can be attributed to XGBoost being an extension of Gradient Boosting that incorporates further optimizations and improvements in performance and efficiency. These findings provide valuable references and insights for future research and applications.

Figure 1. (A) Schematic diagram of the nanosecond laser processing system; (B) triaxial workbench; (C) optical path diagram of the nanosecond laser system.

Figure 4. The influence of laser parameters on width (a) and depth (b).

The cross-validation results are based on Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R² Score, as shown in Figure 5.

Figure 6. Actual and predicted values of width. Panels (a) to (f): Elastic Net, Random Forest, Gradient Boosting, XGBoost, Bayesian Regression, and Gaussian Process Regression.

Figure 7. Actual and predicted values of depth. Panels (a) to (f): Elastic Net, Random Forest, Gradient Boosting, XGBoost, Bayesian Regression, and Gaussian Process Regression.

Table I. Experimental parameter settings.