Similar literature
1 result found
1.
One of the main factors that affects the performance of MLP neural networks trained using the backpropagation algorithm in mineral-potential mapping is the paucity of deposit relative to barren training patterns. To overcome this problem, random noise is added to the original training patterns in order to create additional synthetic deposit training data. Experiments on the effect of the number of deposits available for training in the Kalgoorlie Terrane orogenic gold province show that both the classification performance of a trained network and the quality of the resultant prospectivity map increase significantly with increased numbers of deposit patterns. Experiments are conducted to determine the optimum amount of noise using both uniform and normally distributed random noise. Through the addition of noise to the original deposit training data, the number of deposit training patterns is increased from approximately 50 to 1000. The percentage of correct classifications improves significantly both for the independent test set as a whole and for the deposit patterns in the test set. For example, using ±40% uniform random noise, the test-set classification performance increases from 67.9% and 68.0% to 72.8% and 77.1% (for the overall test set and the test-set deposit patterns, respectively). Indices for the quality of the resultant prospectivity map (i.e. D/A and D × (D/A), where D is the percentage of deposits and A is the percentage of the total area in the highest prospectivity map-class, together with the area under an ROC curve) also increase from 8.2, 105 and 0.79 to 17.9, 226 and 0.87, respectively. Increasing the size of the training-stop data set results in a further increase in performance, to 73.5%, 77.4%, 14.7, 296 and 0.87 for the overall test set, test-set deposit patterns, D/A, D × (D/A), and area under the ROC curve, respectively.
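
The augmentation procedure and the map-quality indices described in the abstract can be summarised in a short sketch. The Python code below is not the authors' implementation; it is a minimal illustration that assumes multiplicative uniform noise (roughly the ±40% case) or additive Gaussian noise scaled by each feature's standard deviation, and a single score threshold defining the highest prospectivity map-class. All function names, parameters, and the stand-in data in the usage lines are illustrative assumptions.

# A minimal sketch (not the authors' code) of noise-based augmentation of deposit
# training patterns and of the prospectivity-map quality indices quoted above.
# All names and parameters here are illustrative assumptions.

import numpy as np
from sklearn.metrics import roc_auc_score


def augment_with_noise(deposit_patterns, n_target=1000, noise="uniform",
                       noise_level=0.40, seed=None):
    """Grow ~50 original deposit patterns to n_target synthetic patterns.

    noise="uniform": multiplicative noise drawn from U(-noise_level, +noise_level),
    i.e. roughly the +/-40% uniform noise mentioned in the abstract.
    noise="normal": additive Gaussian noise with std = noise_level * feature std.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(deposit_patterns, dtype=float)
    base = X[rng.integers(0, len(X), size=n_target)]   # resample the originals
    if noise == "uniform":
        perturbation = base * rng.uniform(-noise_level, noise_level, size=base.shape)
    else:
        perturbation = rng.normal(0.0, noise_level * X.std(axis=0), size=base.shape)
    return base + perturbation


def map_quality_indices(y_true, scores, threshold):
    """D/A, D x (D/A) and ROC AUC for the highest prospectivity map-class.

    D = percentage of known deposits (y_true == 1) in cells scoring >= threshold.
    A = percentage of all cells scoring >= threshold.
    """
    y_true = np.asarray(y_true)
    scores = np.asarray(scores)
    in_top = scores >= threshold
    D = 100.0 * in_top[y_true == 1].mean()
    A = 100.0 * in_top.mean()
    return D / A, D * (D / A), roc_auc_score(y_true, scores)


# Usage with synthetic stand-in data (no real geoscience layers are used here):
rng = np.random.default_rng(0)
deposits = rng.normal(1.0, 0.2, size=(50, 8))          # ~50 original deposit patterns
synthetic = augment_with_noise(deposits, n_target=1000, noise="uniform")
print(synthetic.shape)                                  # (1000, 8)

Whether the noise should be multiplicative or additive, and how the threshold for the highest map-class is chosen, are modelling decisions; the abstract reports results for both uniform and normally distributed noise rather than prescribing one form.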
