Modelling and Computational Experiment to Obtain Optimized Neural Network for Battery Thermal Management Data
Saved in:
Main Authors: , , , ,
Format: article
Language: EN
Published: MDPI AG, 2021
Subjects:
Online Access: https://doaj.org/article/01bfa23bd5b34175abe2316eb1f1c428
Summary: The focus of this work is to computationally obtain an optimized neural network (NN) model to predict battery average Nusselt number (<i>Nu<sub>avg</sub></i>) data using four activation functions. The battery <i>Nu<sub>avg</sub></i> is highly nonlinear, as reported in the literature, and depends mainly on flow velocity, coolant type, heat generation, thermal conductivity, battery length-to-width ratio, and the spacing between parallel battery packs. <i>Nu<sub>avg</sub></i> is first modeled using a network with only one hidden layer (NN<sub>1</sub>). The number of neurons in NN<sub>1</sub> is varied from 1 to 10 with four activation functions (Sigmoidal, Gaussian, Tanh, and Linear) to obtain the optimized NN<sub>1</sub>. Similarly, a deep NN (NN<sub>D</sub>) was analyzed over numbers of neurons and activation functions to find the optimized number of hidden layers for predicting <i>Nu<sub>avg</sub></i>. RMSE (root mean square error) and R-squared (R<sup>2</sup>) are assessed to determine the optimized NN model. From this computational experiment, it is found that both NN<sub>1</sub> and NN<sub>D</sub> accurately predict the battery data. Six neurons in the hidden layer give the best predictions for NN<sub>1</sub>, and the Sigmoidal and Gaussian functions provided the best results for the NN<sub>1</sub> model. In NN<sub>D</sub>, the optimized model is obtained at a different number of hidden layers and neurons for each activation function. The Sigmoidal and Gaussian functions outperformed the Tanh and Linear functions in the NN<sub>1</sub> model; the Linear function, on the other hand, was unable to forecast the battery data adequately. The Gaussian and Linear functions outperformed the other two activation functions in the NN<sub>D</sub> model. Overall, the deep NN (NN<sub>D</sub>) model predicted better than the single-layered NN (NN<sub>1</sub>) model for each activation function.
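The search the abstract describes for NN<sub>1</sub> (hidden-layer width swept from 1 to 10, four activation functions, models ranked by RMSE and R<sup>2</sup>) can be sketched as follows. This is a minimal illustration, not the paper's method: the <i>Nu<sub>avg</sub></i> dataset and training procedure are not given here, so synthetic data stands in for the six battery inputs, and a random-feature network (random hidden weights with a least-squares output layer) stands in for full backpropagation training.

```python
# Hedged sketch of the NN_1 grid search: vary hidden neurons 1..10 over four
# activation functions, score each fit by RMSE and R^2. Synthetic data and a
# random-feature (ELM-style) fit are assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

activations = {
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "gaussian": lambda z: np.exp(-z**2),
    "tanh": np.tanh,
    "linear": lambda z: z,
}

# Synthetic stand-in for the Nu_avg dataset: 6 inputs (velocity, coolant,
# heat generation, conductivity, L/W ratio, pack spacing), 1 output.
X = rng.uniform(-1, 1, size=(200, 6))
y = np.sin(X @ rng.normal(size=6)) + 0.05 * rng.normal(size=200)

def fit_score(X, y, n_neurons, act):
    """One-hidden-layer net: random input weights + least-squares readout."""
    W = rng.normal(size=(X.shape[1], n_neurons))
    b = rng.normal(size=n_neurons)
    H = act(X @ W + b)                      # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    pred = H @ beta
    rmse = np.sqrt(np.mean((y - pred) ** 2))
    r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return rmse, r2

best = {}
for name, act in activations.items():
    scores = [(fit_score(X, y, n, act), n) for n in range(1, 11)]
    (rmse, r2), n = min(scores)             # lowest RMSE wins
    best[name] = (n, rmse, r2)
    print(f"{name:8s} best n={n:2d}  RMSE={rmse:.4f}  R2={r2:.4f}")
```

The same loop extends to the NN<sub>D</sub> case by adding an outer sweep over the number of hidden layers, which is where the paper reports a different optimum per activation function.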