Article: Alternative Approach to Chemical Accuracy: A Neural Networks-Based First-Principles Method for Heat of Formation of Molecules Made of H, C, N, O, F, S, and Cl

Title: Alternative Approach to Chemical Accuracy: A Neural Networks-Based First-Principles Method for Heat of Formation of Molecules Made of H, C, N, O, F, S, and Cl
Authors: Sun, J; Wu, J; Song, T; Hu, LH; Shan, K; Chen, G
Issue Date: 2014
Citation: Journal of Physical Chemistry A, 2014
Abstract: The neural network correction approach previously proposed to achieve chemical accuracy for first-principles methods is further developed by combining the Kennard–Stone sampling and bootstrapping methods. As a result, the accuracy of the calculated heat of formation is further improved, and the error bar of each calculated result can be determined. An enlarged database (Chen/13), containing a total of 539 molecules made of the common elements H, C, N, O, F, S, and Cl, is constructed and divided into training (449 molecules) and testing (90 molecules) data sets with the Kennard–Stone sampling method. Upon the neural network correction, the mean absolute deviation (MAD) of the B3LYP/6-311+G(3df,2p) calculated heat of formation is reduced from 10.92 to 1.47 kcal mol⁻¹ for the training data set and from 14.95 to 1.31 kcal mol⁻¹ for the testing data set. Furthermore, the bootstrapping method, a widely used statistical technique, is employed to assess the accuracy of each neural network prediction by determining its error bar. The average error bar for the testing data set is 1.05 kcal mol⁻¹, thus achieving chemical accuracy. When a testing molecule falls into regions of the “Chemical Space” where the distribution density of the training molecules is high, its predicted error bar is comparatively small, and the predicted value is correspondingly accurate. As a challenge, the resulting neural network is employed to discern discrepancies among the existing experimental data.
Persistent Identifier: http://hdl.handle.net/10722/202532
ISSN: 1089-5639
2021 Impact Factor: 2.944
2020 SCImago Journal Rankings: 0.756
ISI Accession Number ID: WOS:000342651200027
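
The workflow described in the abstract couples a Kennard–Stone split of the molecule set with bootstrap resampling to attach an error bar to each corrected prediction. The sketch below is a minimal, illustrative Python version of those two statistical steps only; it is not the authors' code. The descriptor matrix, the raw B3LYP values, the reference values, and the linear stand-in for the neural network correction are all placeholder assumptions.

# Minimal sketch (not the authors' code): a Kennard-Stone split of a descriptor
# matrix into training/testing sets, and bootstrap error bars for a correction
# model. X, y_calc, and y_exp are placeholder random data, and a linear
# least-squares fit stands in for the neural network correction.
import numpy as np

def kennard_stone_split(X, n_train):
    """Pick n_train samples by the Kennard-Stone (maximin) algorithm."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Start from the two most distant points.
    selected = list(np.unravel_index(np.argmax(dist), dist.shape))
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_train:
        # Add the point whose minimum distance to the selected set is largest.
        d_min = dist[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining[int(np.argmax(d_min))]
        selected.append(pick)
        remaining.remove(pick)
    return np.array(selected), np.array(remaining)

def fit_correction(X, residual):
    """Stand-in for the neural network correction: linear least-squares fit."""
    A = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
    return coef

def predict_correction(X, coef):
    A = np.hstack([X, np.ones((len(X), 1))])
    return A @ coef

rng = np.random.default_rng(0)
X = rng.normal(size=(539, 5))                 # placeholder molecular descriptors
y_calc = rng.normal(size=539)                 # placeholder raw B3LYP values
y_exp = y_calc + X @ rng.normal(size=5) + rng.normal(scale=0.5, size=539)

train_idx, test_idx = kennard_stone_split(X, n_train=449)

# Bootstrap: refit the correction on resampled training sets and use the spread
# of the test-set predictions as a per-molecule error bar.
preds = []
for _ in range(200):
    boot = rng.choice(train_idx, size=len(train_idx), replace=True)
    coef = fit_correction(X[boot], y_exp[boot] - y_calc[boot])
    preds.append(y_calc[test_idx] + predict_correction(X[test_idx], coef))
preds = np.array(preds)
error_bar = preds.std(axis=0)                 # one error bar per test molecule
print("mean bootstrap error bar:", error_bar.mean())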

 

DC Field: Value (Language)
dc.contributor.author: Sun, J (en_US)
dc.contributor.author: Wu, J (en_US)
dc.contributor.author: Song, T (en_US)
dc.contributor.author: Hu, LH (en_US)
dc.contributor.author: Shan, K (en_US)
dc.contributor.author: Chen, G (en_US)
dc.date.accessioned: 2014-09-19T08:40:52Z
dc.date.available: 2014-09-19T08:40:52Z
dc.date.issued: 2014 (en_US)
dc.identifier.citation: Journal of Physical Chemistry A, 2014 (en_US)
dc.identifier.issn: 1089-5639
dc.identifier.uri: http://hdl.handle.net/10722/202532
dc.language: eng (en_US)
dc.relation.ispartof: Journal of Physical Chemistry A (en_US)
dc.title: Alternative Approach to Chemical Accuracy: A Neural Networks-Based First-Principles Method for Heat of Formation of Molecules Made of H, C, N, O, F, S, and Cl (en_US)
dc.type: Article (en_US)
dc.identifier.email: Sun, J: sunj@hku.hk (en_US)
dc.identifier.email: Song, T: songtaoo@hku.hk (en_US)
dc.identifier.email: Shan, K: klshan@hku.hk (en_US)
dc.identifier.email: Chen, G: ghc@yangtze.hku.hk (en_US)
dc.identifier.authority: Chen, G=rp00671 (en_US)
dc.identifier.doi: 10.1021/jp502096y (en_US)
dc.identifier.pmid: 24979488
dc.identifier.scopus: eid_2-s2.0-84907782453
dc.identifier.hkuros: 235500 (en_US)
dc.identifier.eissn: 1520-5215
dc.identifier.isi: WOS:000342651200027
dc.identifier.issnl: 1089-5639
