Show simple item record

dc.contributor.author: Devasahayam, Sheila
dc.date.accessioned: 2023-07-05T08:58:25Z
dc.date.available: 2023-07-05T08:58:25Z
dc.date.issued: 2023
dc.identifier.citation: Devasahayam, S. 2023. Deep learning models in Python for predicting hydrogen production: A comparative study. Energy. 280: 128088.
dc.identifier.uri: http://hdl.handle.net/20.500.11937/92702
dc.identifier.doi: 10.1016/j.energy.2023.128088
dc.description.abstract

This study relates to predicting hydrogen production using deep learning models. The co-gasification of biomass and plastics dataset used gasification temperature, particle size of biomass rubber seed shell (RSS) and high-density polyethylene (HDPE), and the amount of plastic in the mixture as the independent variables, and the amount of hydrogen produced as the dependent variable. It was found that, during co-gasification, particle size is a controlling factor for hydrogen production because of its influence on surface reactions, while temperature had no significant effect. The neural network models were developed using Keras, and two different architectures were compared with and without L1 and L2 regularizers. The L1 and L2 values were determined by grid search, using the lowest mean squared error on the test sets: for the first architecture, the ideal L1 value is 0.010 and the ideal L2 value is 0.000001; for the second architecture, the ideal L1 value is 0.100 and the ideal L2 value is 0.000010. The mean cross_val_score using negative mean squared error with 10-fold cross-validation is −20.05 (13.10) nMSE for the first architecture with the L2 regularizer (0.000001), and −8.22 (7.77) nMSE for the second architecture with the L2 regularizer (0.000010), indicating that the second architecture performs better. The best model hyperparameters for both architectures were determined using GridSearchCV: for the first architecture, batch_size 3, epochs 100 and the rmsprop optimizer, with a negative mean squared error of −20.95; for the second architecture, batch_size 5, epochs 100 and the adam optimizer, with a negative mean squared error of −7.38, again indicating that the second architecture is the better model. The Keras wrapper improved the performance of the model for the first architecture, but not for the second. The permutation feature importance for architecture 1, in descending order, is: size of RSS, size of HDPE, per cent plastics in the mixture, and temperature; for architecture 2: size of HDPE, size of RSS, per cent plastics in the mixture, and temperature. Overall, the study demonstrates the potential of deep learning models for predicting hydrogen production.
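
As a rough illustration of the workflow described in the abstract, the following minimal Python sketch builds a small Keras regressor with L1/L2 kernel regularization, wraps it so that scikit-learn's GridSearchCV, 10-fold cross_val_score (negative mean squared error) and permutation_importance can be applied. This is not the author's code: the layer sizes, the synthetic placeholder data and the scikeras KerasRegressor wrapper (used here as a stand-in for the Keras wrapper mentioned in the abstract) are assumptions for illustration only.

# Minimal sketch of the workflow described above -- NOT the author's code.
# Data, layer sizes and the scikeras wrapper are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers
from scikeras.wrappers import KerasRegressor
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.inspection import permutation_importance

# Placeholder data standing in for the co-gasification dataset:
# temperature, RSS size, HDPE size, per cent plastics -> hydrogen yield.
rng = np.random.default_rng(0)
X = rng.random((60, 4))
y = rng.random(60)

def build_model(l1=0.010, l2=0.000001):
    """Small fully connected regressor with L1/L2 kernel regularization.
    Returned uncompiled so the wrapper's optimizer/loss settings apply."""
    return keras.Sequential([
        keras.Input(shape=(4,)),
        layers.Dense(16, activation="relu",
                     kernel_regularizer=regularizers.L1L2(l1=l1, l2=l2)),
        layers.Dense(8, activation="relu",
                     kernel_regularizer=regularizers.L1L2(l1=l1, l2=l2)),
        layers.Dense(1),
    ])

# Wrap the Keras model so scikit-learn tools can treat it as an estimator.
reg = KerasRegressor(model=build_model, model__l1=0.010, model__l2=0.000001,
                     loss="mse", optimizer="rmsprop",
                     epochs=100, batch_size=3, verbose=0)

# Hyperparameter search over batch size, epochs and optimizer,
# scored with negative mean squared error.
param_grid = {"batch_size": [3, 5], "epochs": [100],
              "optimizer": ["rmsprop", "adam"]}
grid = GridSearchCV(reg, param_grid, scoring="neg_mean_squared_error", cv=3)
grid.fit(X, y)
print("best params:", grid.best_params_, "nMSE:", grid.best_score_)

# 10-fold cross-validation of the chosen configuration.
scores = cross_val_score(reg, X, y, cv=KFold(n_splits=10),
                         scoring="neg_mean_squared_error")
print("mean (std) nMSE: %.2f (%.2f)" % (scores.mean(), scores.std()))

# Permutation feature importance on a fitted model.
reg.fit(X, y)
perm = permutation_importance(reg, X, y, n_repeats=10, random_state=0,
                              scoring="neg_mean_squared_error")
print("feature importances:", perm.importances_mean)

In the reported study, the same search and scoring were run separately for each of the two architectures; here a single illustrative architecture stands in for both.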

dc.language: English
dc.publisher: Elsevier
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.title: Deep learning models in Python for predicting hydrogen production: A comparative study
dc.type: Journal Article
dcterms.source.volume: 280
dcterms.source.issn: 0360-5442
dcterms.source.title: Energy
dc.date.updated: 2023-07-05T08:58:20Z
curtin.department: WASM: Minerals, Energy and Chemical Engineering
curtin.accessStatus: Open access
curtin.faculty: Faculty of Science and Engineering
curtin.contributor.orcid: Devasahayam, Sheila [0000-0002-6250-7697]
curtin.identifier.article-number: 128088
curtin.contributor.scopusauthorid: Devasahayam, Sheila [6602794932]
curtin.repositoryagreement: V3


