PLoS ONE

Public Library of Science

Fe-based superconducting transition temperature modeling by machine learning: A computer science method

DOI: 10.1371/journal.pone.0255823, Volume: 16, Issue: 8

Article Type: Research Article

Abstract

Searching for new high temperature superconductors has long been a key research issue. Fe-based superconductors attract researchers’ attention due to their high transition temperature, strong irreversibility field, and excellent crystallographic symmetry. By varying doping methods and dopant levels, different types of new Fe-based superconductors have been synthesized. The transition temperature is a key indicator of whether a new superconductor is a high temperature superconductor. However, the conditions for measuring the transition temperature are strict, and the measurement process is dangerous. There is a strong relationship between the lattice parameters and the transition temperature of Fe-based superconductors. To avoid the difficulties of measuring the transition temperature, in this paper we adopt a machine learning method to build a model based on the lattice parameters to predict the transition temperature of Fe-based superconductors. The model results are in good agreement with the available transition temperatures, with an accuracy of 91.181%. Therefore, the proposed model can be used to predict unknown transition temperatures of Fe-based superconductors.

Superconductors, with their zero resistance and the Meissner effect, have significant practical applications [1]. The best known application is in the Magnetic Resonance Imaging (MRI) systems widely employed by health care professionals for detailed internal body imaging. Other prominent applications include frictionless magnetically levitated trains and electrical power transmission with no energy loss [2–5]. However, superconductors exhibit superconductivity only at or below their transition temperature [6], which holds back their widespread application.

Researchers have been conducting an extensive search for novel superconductors, especially those with high transition temperatures. High temperature superconductors include cuprate superconductors containing CuO_{2} planes [7–10], MgB_{2} [11], hydride superconductors under extreme pressure [12–18], and Fe-based superconductors [19]. In particular, Fe-based superconductors have transition temperatures second only to the cuprates, an upper critical field above 50 T, a relatively strong irreversibility field, and high crystallographic symmetry [20], which attracts the attention of researchers. In exploring the factors that influence the Fe-based superconducting transition temperature, a strong relationship between the transition temperature and the lattice parameters has been found [21–26]. According to composition and crystal structure, Fe-based superconductors are divided into four categories: ReFeAsO (Re = rare earth elements) (1111 system), AFe_{2}As_{2} (A = K, Sr, Ba, etc.) (122 system), LiFeAs (111 system), and FeSe (11 system).

At present, one of the main research directions for Fe-based superconductors is to improve their transition temperature via various doping methods and dopant levels [27,28]. The transition temperature is a key indicator of whether a new superconductor is a high temperature superconductor. However, measuring the transition temperature requires high-precision devices, including temperature controllers, constant current sources, and voltmeters. These conditions cannot be met by ordinary laboratories. Meanwhile, liquid nitrogen (77 K) must be handled manually during the measurement process, which carries certain safety risks. In addition, superconductors with strict temperature requirements depend mainly on liquid helium (4.2 K) as the refrigerant. Because the equipment for liquefying helium is very complicated and the liquid helium temperature (4.2 K) is close to absolute zero, measuring the transition temperature is very difficult.

Machine Learning (ML) is a branch of artificial intelligence that is still growing and evolving, and it is an active field in data science. One of its applications is data mining. In past decades, ML algorithms and theory have advanced considerably, supported by abundant data and robust computing infrastructure. Data mining is now rapidly being applied to superconducting materials science. Examples include using Gaussian process regression to predict physical parameters of superconductors [29–38]; using support vector regression [39], the random forest algorithm [40], and the XGBoost model [41] to predict high temperature superconductor candidates; and using a GMDH-type neural network [42] to predict hysteresis loops of superconductors. The BP algorithm has excellent complex pattern classification and multi-dimensional function mapping capabilities, and it is applied in function fitting, data analysis, and prediction. To avoid the strict measurement conditions and risk factors of the transition temperature measurement process, in this paper we adopt a machine learning method to build a model based on the lattice parameters to predict the transition temperature of Fe-based superconductors.

The BP algorithm is an error back propagation algorithm, consisting of forward propagation and backward propagation of error signals. The forward propagation direction is input layer → hidden layer → output layer; the state of each layer of neurons affects only the state of the next layer. If the expected value is not obtained at the output layer, the error signal propagates backward, in the direction output layer → hidden layer → input layer. By adjusting the weights and thresholds of each layer, the error decreases along the negative gradient direction. The weights and thresholds are iterated continuously until the error meets the precision requirement. The algorithm process is shown in (Fig 1).

Assume a three-layer network with a d-dimensional input, an l-dimensional output, and a q-dimensional hidden layer, as shown in (Fig 2). In the network, the threshold of the j-th neuron in the output layer is *θ*_{j}, the threshold of the h-th neuron in the hidden layer is *γ*_{h}, the weight between the i-th neuron in the input layer and the h-th neuron in the hidden layer is *v*_{ih}, and the weight between the h-th neuron in the hidden layer and the j-th neuron in the output layer is *w*_{hj}.

The input of the h-th neuron in the hidden layer is:

$$\alpha_{h}=\sum_{i=1}^{d}v_{ih}x_{i}.$$

The input of the j-th neuron in the output layer is:

$$\beta_{j}=\sum_{h=1}^{q}w_{hj}b_{h},$$

where $b_{h}=f(\alpha_{h}-\gamma_{h})$ is the output of the h-th neuron in the hidden layer.

For a training example $(x_{k},y_{k})$, the output of the network is

$$\tilde{y}_{k}=(\tilde{y}_{1}^{k},\tilde{y}_{2}^{k},\dots,\tilde{y}_{l}^{k}),$$

$$\tilde{y}_{j}^{k}=f(\beta_{j}-\theta_{j}),$$

where f(·) is an activation function.

The mean square error is:

$$E_{k}=\frac{1}{2}\sum_{j=1}^{l}(\tilde{y}_{j}^{k}-y_{j}^{k})^{2},$$

where $y_{j}^{k}$ is the actual value.

The BP algorithm is an iterative algorithm; taking the parameter v as an example, its updating formula is:

$$v\leftarrow v+\Delta v.$$

The update of the weight *w*_{hj} between the hidden layer and the output layer is:

$$\Delta w_{hj}=-\eta\frac{\partial E_{k}}{\partial w_{hj}},$$

where η is the learning rate.

According to the chain rule:

$$\frac{\partial E_{k}}{\partial w_{hj}}=\frac{\partial E_{k}}{\partial\tilde{y}_{j}^{k}}\cdot\frac{\partial\tilde{y}_{j}^{k}}{\partial\beta_{j}}\cdot\frac{\partial\beta_{j}}{\partial w_{hj}}.$$

According to the definition of *β*_{j}:

$$\frac{\partial\beta_{j}}{\partial w_{hj}}=b_{h}.$$

From the definitions of $E_{k}$ and $\tilde{y}_{j}^{k}$, define the output-layer gradient term:

$$g_{j}=-\frac{\partial E_{k}}{\partial\tilde{y}_{j}^{k}}\cdot\frac{\partial\tilde{y}_{j}^{k}}{\partial\beta_{j}}=-(\tilde{y}_{j}^{k}-y_{j}^{k})f^{\prime}(\beta_{j}-\theta_{j}).$$

The updating formulas of the weights and thresholds are:

$$\Delta w_{hj}=\eta g_{j}b_{h},$$

$$\Delta\theta_{j}=-\eta g_{j},$$

$$\Delta v_{ih}=\eta e_{h}x_{i},$$

$$\Delta\gamma_{h}=-\eta e_{h}.$$

In the formulas for $\Delta v_{ih}$ and $\Delta\gamma_{h}$, the hidden-layer gradient term (for a sigmoid activation, where $f^{\prime}=f(1-f)$) is:

$$e_{h}=-\frac{\partial E_{k}}{\partial b_{h}}\cdot\frac{\partial b_{h}}{\partial\alpha_{h}}=b_{h}(1-b_{h})\sum_{j=1}^{l}w_{hj}g_{j}.$$

By continuously iterating the weights *w*_{hj} and *v*_{ih} and the thresholds *θ*_{j} and *γ*_{h}, the accuracy of the network continues to improve. The performance of the trained network is evaluated by the mean absolute error (MAE), the root mean square error (RMSE), and the correlation coefficient (CC).
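The update rules above can be sketched as a small Python implementation. This is a minimal illustration under stated assumptions, not the authors' code: a sigmoid activation is assumed for both layers (so that f′(x) = f(x)(1 − f(x)), which yields the b_h(1 − b_h) factor in e_h), and the class and variable names are ours.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BPNetwork:
    """Three-layer network trained with the BP update rules derived above."""

    def __init__(self, d, q, l, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.v = rng.normal(scale=0.5, size=(d, q))  # input-to-hidden weights v_ih
        self.w = rng.normal(scale=0.5, size=(q, l))  # hidden-to-output weights w_hj
        self.gamma = np.zeros(q)                     # hidden thresholds gamma_h
        self.theta = np.zeros(l)                     # output thresholds theta_j
        self.eta = eta                               # learning rate

    def forward(self, x):
        b = sigmoid(x @ self.v - self.gamma)         # b_h = f(alpha_h - gamma_h)
        y_hat = sigmoid(b @ self.w - self.theta)     # y~_j = f(beta_j - theta_j)
        return b, y_hat

    def train_step(self, x, y):
        """One online gradient-descent update for a single training example."""
        b, y_hat = self.forward(x)
        g = y_hat * (1.0 - y_hat) * (y - y_hat)      # g_j with a sigmoid output
        e = b * (1.0 - b) * (self.w @ g)             # e_h = b_h(1-b_h) sum_j w_hj g_j
        self.w += self.eta * np.outer(b, g)          # delta w_hj  =  eta g_j b_h
        self.theta -= self.eta * g                   # delta theta_j = -eta g_j
        self.v += self.eta * np.outer(x, e)          # delta v_ih  =  eta e_h x_i
        self.gamma -= self.eta * e                   # delta gamma_h = -eta e_h
        return 0.5 * np.sum((y_hat - y) ** 2)        # E_k for this example
```

Repeating `train_step` over the training set gives one epoch of online gradient descent; in this paper's setting, the two lattice parameters would form the input and the (suitably scaled) transition temperature the output.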

Data were obtained from Japan’s National Institute for Materials Science (NIMS) at http://supercon.nims.go.jp/index_en.html. After data processing, 203 sets of data were collected. To strengthen data relevance, the data contained only the four representative types of Fe-based superconductors, namely 11, 111, 122, and 1111. The spatial distribution of the data relating the transition temperature to the lattice parameters is displayed in (Fig 3). The data fall roughly into 4 groups, corresponding to the 4 types of Fe-based superconductors. Each group shows a certain degree of discreteness and a non-linear relationship, which meets the modeling requirements.

The visualization of the transition temperature is shown in (Fig 4); the data are discrete, with no clustering. The data are distributed between 0–60 K, which is consistent with the transition temperature range of Fe-based superconductors. Statistical analysis of the transition temperature—including maximum, minimum, mean, variance, standard deviation (std), range, median, coefficient of variation, and skewness—is presented in (Table 1). The coefficient of variation is 70.03%, indicating that the transition temperature is well dispersed. The skewness is greater than zero, indicating that the data above the mean value are more scattered than the data below it.

Parameter | Data
---|---
Maximum | 56.5000
Minimum | 1.8000
Mean | 19.3666
Variance | 183.9457
Std | 13.5627
Range | 54.7000
Median | 14.4000
Coefficient of variation | 0.7003
Skewness | 1.1113
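The statistics in Table 1 can be reproduced with a few lines of Python. The sample below uses hypothetical Tc values for illustration only (the actual 203-point NIMS dataset is not reproduced here); population (biased) estimators are assumed, which matches the table's variance/std relation.

```python
import numpy as np

# Hypothetical transition temperatures (K), for illustration only;
# the real dataset comes from the NIMS SuperCon database.
tc = np.array([3.2, 8.0, 14.4, 18.0, 20.1, 25.0, 38.0, 55.0])

mean = tc.mean()
std = tc.std(ddof=0)  # population standard deviation
stats = {
    "maximum": tc.max(),
    "minimum": tc.min(),
    "mean": mean,
    "variance": tc.var(ddof=0),
    "std": std,
    "range": tc.max() - tc.min(),
    "median": np.median(tc),
    "coefficient of variation": std / mean,          # dispersion relative to the mean
    "skewness": np.mean((tc - mean) ** 3) / std**3,  # > 0 means a longer right tail
}
```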

This paper randomly divides the 203 sets of data into 2/3 training data and 1/3 testing data, and trains the model. The regression analysis between the actual transition temperature and the estimated transition temperature during training is presented in (Fig 5), with an accuracy of 91.181%, showing reasonable accuracy and powerful generalization. The performance of the model is shown in (Table 2). The MAE and CC are 0.47265 and 85.44%, respectively, representing closely matching performance and good prediction performance.

Parameter | Data
---|---
MAE | 0.47265
RMSE | 7.6863
CC | 0.8544
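The three performance measures can be computed as follows. This is a generic sketch (the function name is ours), taking MAE and RMSE in the units of the predicted quantity and CC as the Pearson correlation coefficient.

```python
import numpy as np

def evaluate(actual, predicted):
    """Return (MAE, RMSE, CC) for measured vs. predicted transition temperatures."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    mae = np.mean(np.abs(p - a))           # mean absolute error
    rmse = np.sqrt(np.mean((p - a) ** 2))  # root mean square error
    cc = np.corrcoef(a, p)[0, 1]           # Pearson correlation coefficient
    return mae, rmse, cc
```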

To further assess prediction stability, the model performance was measured over 5 repeated predictions, as shown in (Table 3). All predictions generally maintain high accuracy on the training sample. The std of the MAE, RMSE, and accuracy are 0.0245, 0.5162, and 0.7057%, respectively, meaning that the prediction errors are in a controllable range and that the model has good prediction stability.

Parameter | MAE | RMSE | Accuracy
---|---|---|---
1st | 0.47265 | 7.6863 | 91.181%
2nd | 0.47845 | 8.4063 | 91.296%
3rd | 0.50231 | 7.1271 | 90.382%
4th | 0.47675 | 8.3406 | 89.428%
5th | 0.42716 | 8.4370 | 91.131%
Minimum | 0.42716 | 7.1271 | 89.428%
Maximum | 0.50231 | 8.4370 | 91.296%
Mean | 0.471464 | 7.99946 | 90.6836%
Median | 0.47675 | 8.3406 | 91.131%
Std | 0.0245 | 0.5162 | 0.7057%
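The summary rows of Table 3 can be verified directly from the five runs. A population (biased) standard deviation is assumed, which reproduces the table's values:

```python
import numpy as np

# The five repeated-prediction results from Table 3
mae = np.array([0.47265, 0.47845, 0.50231, 0.47675, 0.42716])
rmse = np.array([7.6863, 8.4063, 7.1271, 8.3406, 8.4370])
acc = np.array([91.181, 91.296, 90.382, 89.428, 91.131])  # percent

for name, x in [("MAE", mae), ("RMSE", rmse), ("Accuracy", acc)]:
    print(f"{name}: min={x.min()} max={x.max()} "
          f"mean={x.mean():.6g} median={np.median(x):.6g} std={x.std(ddof=0):.4f}")
```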

In (Table 4), the performance of our BP model is compared with that of two other models from previous studies, RF (Random Forest) [43] and MLR (Multi-variable Linear Regression) [44]. Our BP model shows the best performance in terms of CC and accuracy. In addition, our BP model is more straightforward than the others in terms of computation and implementation.

Model | CC | Accuracy
---|---|---
RF | 0.82 | 88.26%
MLR | 0.84 | 88%
BP | 0.8544 | 91.181%

To verify the feasibility and validity of the new model, 10 Fe-based superconductors covering the four types, with transition temperature values in the range 4.1–53.5 K, were selected from the literature [45–50]; none of them were included in the training data. We input the lattice parameters of every Fe-based superconductor into the model and obtained the corresponding predicted transition temperature. The results are presented in (Table 5), and the visualization is shown in (Fig 6). The superconductors NaFeAs, SmFeAsO_{0.2}F_{0.8}, LaFePO, and LaOFeAs have slightly larger errors (1–2 K), while the superconductors SmFeAsO_{0.93}F_{0.07}, Ba_{0.82}K_{0.18}Fe_{2}As_{2}, LiFeP, FeSe, and FeSe_{0.82} show good accuracy (errors of 0.2–0.5 K). The results show that the model achieves an acceptable accuracy and that the transition temperature of Fe-based superconductors can be estimated from the lattice parameters.

Elements | Lat. a (Å) | Lat. c (Å) | Tc (K) | Prediction (K) | References
---|---|---|---|---|---
SmFeAsO_{0.93}F_{0.07} | 3.393 | 8.482 | 35.0 | 34.4591 | [45]
NaFeAs | 3.928 | 6.364 | 19.0 | 20.6925 | [45]
Ba_{0.82}K_{0.18}Fe_{2}As_{2} | 3.937 | 13.155 | 25.4 | 25.0637 | [46]
LiFeP | 3.692 | 6.031 | 6.0 | 6.2621 | [47]
SmFeAsO_{0.2}F_{0.8} | 3.931 | 8.477 | 49.0 | 47.1806 | [48]
LaFePO | 3.962 | 8.511 | 4.1 | 5.2642 | [49]
LaOFeAs | 4.035 | 8.740 | 41.0 | 42.4102 | [49]
LaOFeAs | 4.035 | 8.435 | 53.5 | 52.4721 | [49]
FeSe | 3.770 | 5.521 | 9.5 | 9.1126 | [50]
FeSe_{0.82} | 3.770 | 5.510 | 10.3 | 10.1835 | [50]
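As a check on the error discussion above, the absolute prediction errors in Table 5 can be computed directly. The compound labels below are flattened from the subscript notation, and the two LaOFeAs rows are distinguished by their c lattice parameter:

```python
# Actual vs. predicted Tc (K) for the ten validation compounds in Table 5
data = {
    "SmFeAsO0.93F0.07":  (35.0, 34.4591),
    "NaFeAs":            (19.0, 20.6925),
    "Ba0.82K0.18Fe2As2": (25.4, 25.0637),
    "LiFeP":             (6.0, 6.2621),
    "SmFeAsO0.2F0.8":    (49.0, 47.1806),
    "LaFePO":            (4.1, 5.2642),
    "LaOFeAs (c=8.740)": (41.0, 42.4102),
    "LaOFeAs (c=8.435)": (53.5, 52.4721),
    "FeSe":              (9.5, 9.1126),
    "FeSe0.82":          (10.3, 10.1835),
}
errors = {name: abs(pred - actual) for name, (actual, pred) in data.items()}
worst = max(errors, key=errors.get)  # compound with the largest absolute error
```

The largest error (about 1.8 K, for SmFeAsO_{0.2}F_{0.8}) stays within the 1–2 K band reported above.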

In this paper, we used a machine learning method to predict the transition temperature of Fe-based superconductors based on the lattice parameters. By training a BP neural network, an acceptable accuracy of 91.181% was obtained on the available data. We measured the performance to estimate the model's stability, and the model errors were in a controllable range. Predictions from the trained model are close to the actual values. These results suggest that the model is capable of estimating the transition temperature of Fe-based superconductors with reasonable accuracy and is therefore recommended for predicting the transition temperature of Fe-based superconductors.

This research did not receive any specific grants from funding agencies in the public, commercial, or not-for-profit sectors.

1. DA Cardwell, DS Ginley. Handbook of Superconducting Materials: Characterization, Applications and Cryogenics. Institute of Physics (2003).

2. F Ben Azzouz, et al. Structure, microstructure and transport properties of B-doped YBCO system. Physica C 442, 13–19 (2006). doi: 10.1016/j.physc.2006.03.135

3. H Su, DO Welch. The effects of space charge, dopants, and strain fields on surfaces and grain boundaries in YBCO compounds. Supercond. Sci. Technol. 18, 24–34 (2005). doi: 10.1088/0953-2048/18/1/005

4. D Volochova, et al. Time dependent changes in Ag doped YBCO superconductors. Acta Physica Polonica A 118, 1047–1048 (2010). doi: 10.12693/APhysPolA.118.1047

5. P Paturia, H Palonen, H Huhtinen. Properties of Pr- and BZO-doped YBCO multilayers. Physics Procedia 36, 661–664 (2012). doi: 10.1016/j.phpro.2012.06.263

6. PF Dahl. Kamerlingh Onnes and the discovery of superconductivity: The Leyden years, 1911–1914. Hist. Stud. Phys. Sci. 15, 1–37 (1984).

7. H Maeda, Y Tanaka, M Fukutomi, T Asano. A new high-Tc oxide superconductor without a rare earth element. Jpn. J. Appl. Phys. 27, L209 (1988).

8. Y Wang, J Zheng, Z Zhu, M Zhang, W Yuan. Quench behavior of high-temperature superconductor (RE)Ba2Cu3Ox CORC cable. J. Phys. D: Appl. Phys. 52(34), 345303 (2019).

9. P Yang, Y Wang, D Qiu, T Chang, H Ma, J Zhu, et al. Design and fabrication of a 1-MW high-temperature superconductor DC induction heater. IEEE Trans. Appl. Supercond. 29(5), 1–6 (2019).

10. P Yang, K Li, Y Wang, L Wang, Q Wu, A Huang, et al. Quench protection system of a 1-MW high-temperature superconductor DC induction heater. IEEE Trans. Appl. Supercond. 29(5), 1–6 (2019).

11. J Nagamatsu, N Nakagawa, T Muranaka, Y Zenitani, J Akimitsu. Superconductivity at 39 K in magnesium diboride. Nature 410, 63–64 (2001). doi: 10.1038/35065039

12. AP Durajski, R Szczesniak. Supercond. Sci. Technol. 27, 115012 (2014).

13. D Duan, Y Liu, F Tian, D Li, X Huang, Z Zhao, et al. Nat. Sci. Rep. 4, 696 (2014).

15. I Errea, M Calandra, CJ Pickard, J Nelson, RJ Needs, Y Li, et al. Phys. Rev. Lett. 114, 157004 (2015). doi: 10.1103/PhysRevLett.114.157004

16. DY Kim, RH Scheicher, R Ahuja. Phys. Rev. Lett. 103, 077022 (2009).

17.

18.

19. Y Kamihara, T Watanabe, M Hirano, H Hosono. Iron-based layered superconductor La[O1−xFx]FeAs (x = 0.05–0.12) with Tc = 26 K. J. Am. Chem. Soc. 130(11), 3296–3297 (2008). doi: 10.1021/ja800073m

20. Y Zhang, X Xu. Fe-based superconducting transition temperature modeling through Gaussian process regression. J. Low Temp. Phys. doi: 10.1007/s10909-020-0253-9

21. DJ Scalapino. Superconductivity. Marcel Dekker (1969).

22. PW Anderson. Physica C 185, 11 (1991).

23. DJ Scalapino. Physica C 235, 107 (1994).

24. D Pines. Physica C 235, 280 (1994).

25. S Chakravarty, S Kivelson. Europhys. Lett. 16, 751 (1991).

26. RB Laughlin. Physica C 234, 280 (1994).

27. K Watanabe. Effect of anion concentration in substitution for in the Bi-Pb-Sr-Ca-Cu-O (2223-phase) system superconductor. Supercond. Sci. Technol. 11(9), 843 (1998).

28. Z Tang, SJ Wang, XH Gao, et al. Evidence for charge transfer in Bi-based superconductors studied by positron annihilation. Phys. Lett. A 17(3–4), 320–324 (1993).

29. Y Zhang, X Xu. Predicting doped MgB2 superconductor critical temperature from lattice parameters using Gaussian process regression. Phys. C: Supercond. Appl. 573, 1353633 (2020).

30. Y Zhang, X Xu. Curie temperature modeling of magnetocaloric lanthanum manganites using Gaussian process regression. J. Magn. Magn. Mater. 512, 166998 (2020).

31. Y Zhang, X Xu. Machine learning the magnetocaloric effect in manganites from compositions and structural parameters. AIP Adv. 10(3), 035220 (2020).

32. Y Zhang, X Xu. Predicting the thermal conductivity enhancement of nanofluids using computational intelligence. Phys. Lett. A 384, 126500 (2020).

33. Y Zhang, X Xu. Machine learning modeling of lattice constants for half-Heusler alloys. AIP Adv. 10, 045121 (2020).

34. Y Zhang, X Xu. Relative cooling power modeling of lanthanum manganites using Gaussian process regression. RSC Adv. 10, 20646–20653 (2020). doi: 10.1038/s41598-020-77678-8

35. Y Zhang, X Xu. Machine learning band gaps of doped-TiO2 photocatalysts from structural and morphological parameters. ACS Omega 5, 15344–15352 (2020). doi: 10.1021/acsomega.0c01438

36. Y Zhang, X Xu. Machine learning lattice constants for cubic perovskite ABX_{3} compounds. ChemistrySelect 5, 9999–10009 (2020).

37. R Juneja, G Yumnam, S Satsangi, AK Singh. Coupling the high-throughput property map to machine learning for predicting lattice thermal conductivity. Chem. Mater. 31(14), 5145–5151 (2019).

38. R Juneja, AK Singh. Guided patchwork kriging to develop highly transferable thermal conductivity prediction models. J. Phys.: Mater. 3(2), 024006 (2020).

39. V Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York (1995).

40. L Breiman. Random forests. Mach. Learn. 45, 5–32 (2001).

41. K Hamidieh. A data-driven statistical model for predicting the critical temperature of a superconductor. Comput. Mater. Sci. 154, 346–354 (2018). doi: 10.1016/j.commatsci.2018.07.052

42. T Akram, SR Naqvi, SA Haider, M Kamran. A novel framework for approximation of magneto-resistance curves of a superconducting film using GMDH-type neural networks. Superlattices Microstruct. 145, 106635 (2020). doi: 10.1016/j.spmi.2020.106635

43. Z Alizadeh, MR Mohammadizadeh. Predicting electron-phonon coupling constants of superconducting elements by machine learning. Physica C 558, 7–11 (2019). doi: 10.1016/j.phy.2018.12.008

44. CZ Cai, W Zhu, JF Pei, et al. Predicting the superconducting transition temperature Tc of BiPbSrCaCuOF superconductors by using support vector regression. J. Supercond. Nov. Magn. 23(5), 737–740 (2010).

45.

46. P Dai, J Hu, E Dagotto. Magnetism and its microscopic origin in iron-based high-temperature superconductors. Nat. Phys. 8(10), 709–718 (2012).

47. Y Maeno, H Hashimoto, K Yoshida, S Nishizaki, T Fujita, JG Bednorz, et al. Superconductivity in a layered perovskite without copper. Nature 372(6506), 532–534 (1994).

48. ZA Ren, W Lu, J Yang, W Yi, XL Shen, ZC Li, et al. Superconductivity at 55 K in iron-based F-doped layered quaternary compound Sm[O1−xFx]FeAs. arXiv:0804.2053 (2008).

49. MA McGuire, AD Christianson, AS Sefat, R Jin, EA Payzant, BC Sales, et al. Evidence for the spin density wave in LaFeAsO. arXiv-0804 (2008).

50. I Yamada, AA Belik, M Azuma, S Harjo, T Kamiyama, Y Shimakawa, et al. Single-layer oxychloride superconductor Ca_{2−x}CuO_{2}Cl_{2} with A-site cation deficiency. Phys. Rev. B 72(22), 224503 (2005).

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Zhiyuan Hu, Mahendra Singh Dhaka