Similar Literature
19 similar documents found.
1.
Application of a particle swarm optimization curve-fitting method in hydrological frequency analysis (cited by 2: 0 self-citations, 2 by others)
Hydrological frequency analysis provides water projects with design data that carry an explicit probability, from which the scale, investment and benefits of a project are determined. Traditional hydrological frequency computation generally assumes that the population follows a Pearson type III (P-III) distribution and then estimates the parameters from the sample by curve fitting, from which the design values are derived. Once the distribution type is fixed, therefore, hydrological frequency computation reduces to estimating the distribution parameters from the sample data. To make fuller use of the sample information, improve the precision of parameter estimation and reduce subjective influences, this paper proposes an optimized curve-fitting method based on the particle swarm optimization (PSO) algorithm and applies it to frequency analysis of annual maximum discharge. To characterize its statistical behaviour further, the method is also compared with traditional parameter-estimation methods (method of moments, weight-function method and probability-weighted moments). A worked example shows that the method completes the parameter search quickly and locates a good global optimum of the parameters.
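A minimal sketch of the idea, assuming a plain global-best PSO and scipy's P-III implementation; the synthetic series, bounds and PSO constants are illustrative, not the paper's settings:

```python
# PSO curve fitting for a P-III frequency curve: minimize the sum of
# squared deviations between observed annual maxima and fitted quantiles.
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(0)
q = np.sort(rng.gamma(3.0, 500.0, size=50))[::-1]   # synthetic annual maxima
p = np.arange(1, q.size + 1) / (q.size + 1)         # empirical exceedance probs

def sse(theta):
    """Sum of squared deviations between observed and fitted quantiles."""
    skew, loc, scale = theta
    if scale <= 0:
        return np.inf
    fitted = pearson3.ppf(1 - p, skew, loc=loc, scale=scale)
    return np.sum((q - fitted) ** 2)

n, iters = 30, 200                                   # swarm size, iterations
lo = np.array([0.1, 0.0, 1.0]); hi = np.array([3.0, 3000.0, 2000.0])
x = rng.uniform(lo, hi, (n, 3)); v = np.zeros_like(x)
pbest = x.copy(); pcost = np.array([sse(t) for t in x])
gbest = pbest[pcost.argmin()]
for _ in range(iters):
    r1, r2 = rng.random((2, n, 3))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    cost = np.array([sse(t) for t in x])
    better = cost < pcost
    pbest[better], pcost[better] = x[better], cost[better]
    gbest = pbest[pcost.argmin()]
print("PSO estimate (skew, loc, scale):", gbest)
```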

2.
江竹  宋文武 《水文》2013,33(1):74-78
To overcome the limitations of classical stage-discharge models in capturing the dynamic behaviour of rivers, a locally weighted regression algorithm is proposed for estimating the model parameters; to improve the precision of parameter estimation and the efficiency of discharge computation, a clustering-tree weighted regression method is introduced. The training samples are first clustered, a new stage sample is then assigned to the most appropriate cluster with the k-nearest-neighbour rule, and the daily river discharge is finally estimated. Because irrelevant information is excluded during estimation, both the efficiency and the accuracy of daily discharge estimation are improved. The method was tested on measured data from a gauging station; simulation results show high estimation accuracy, providing a new and effective way to estimate the parameters of stage-discharge models.
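A hedged sketch of the cluster-then-local-regression idea; the KMeans clustering, tricube weights and synthetic rating data are assumptions standing in for the paper's clustering tree:

```python
# Cluster the training stages, route a new stage to its nearest cluster,
# then fit a distance-weighted straight line inside that cluster only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
stage = rng.uniform(1.0, 8.0, 300)                       # water level (m)
flow = 25.0 * stage ** 1.8 + rng.normal(0.0, 20.0, 300)  # synthetic discharge

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(stage.reshape(-1, 1))

def predict_flow(h):
    """Assign stage h to a cluster, then do locally weighted regression."""
    label = km.predict(np.array([[h]]))[0]
    xs, ys = stage[km.labels_ == label], flow[km.labels_ == label]
    d = np.abs(xs - h)
    w = (1 - (d / (d.max() + 1e-12)) ** 3) ** 3           # tricube weights
    sw = np.sqrt(w)                                       # WLS via sqrt-weights
    X = np.column_stack([np.ones_like(xs), xs])
    beta = np.linalg.lstsq(X * sw[:, None], ys * sw, rcond=None)[0]
    return beta[0] + beta[1] * h

print(round(predict_flow(5.0), 1))   # daily discharge estimate at stage 5 m
```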

3.
Fuzzy statistical characteristics of the mechanical parameters of geotechnical samples (cited by 12: 5 self-citations, 7 by others)
徐卫亚  蒋中明 《岩土力学》2004,25(3):342-346
All kinds of information contained in geotechnical parameters carry fuzziness. Based on a study of test values of the mechanical parameters of geotechnical samples, it is argued that fuzzy uncertainty should be fully accounted for when analysing the statistical characteristics of a sample. On the basis of specified membership functions, formulas for the fuzzy statistical characteristic values of geotechnical mechanical parameters are derived and their soundness is demonstrated. A worked example shows that, because the proposed method accounts for parameter fuzziness, its results are more reasonable than those of conventional statistics; in every case the fuzzy variance and fuzzy coefficient of variation are smaller than their conventional counterparts. The analysis indicates that the method is widely applicable.
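A toy sketch of membership-weighted ("fuzzy") statistics; the triangular membership function and cohesion data are assumptions for illustration, while the paper derives its own formulas:

```python
# Each test value gets a membership grade mu in [0, 1]; the characteristic
# values are then computed as mu-weighted moments.
import numpy as np

c = np.array([18.5, 20.1, 22.3, 19.4, 35.0, 21.0, 20.6])  # cohesion tests, kPa
center, half_width = 20.5, 6.0
mu = np.clip(1.0 - np.abs(c - center) / half_width, 0.0, 1.0)  # triangular grades

fuzzy_mean = np.sum(mu * c) / np.sum(mu)
fuzzy_var = np.sum(mu * (c - fuzzy_mean) ** 2) / np.sum(mu)
print(fuzzy_mean, fuzzy_var, np.sqrt(fuzzy_var) / fuzzy_mean)  # mean, var, Cv

# The outlying 35.0 kPa value receives membership 0 here, so the fuzzy
# variance is smaller than the ordinary sample variance, consistent with
# the abstract's observation.
```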

4.
Moderately weathered columnar stromatolitic limestone is widely distributed in the study area; its strength is highly variable and regionally dependent, and rock strength is a key factor in slope stability. Slope-stability evaluation therefore requires treating the in-situ test data of the study area as random variables and performing probabilistic statistical analysis to obtain reliable rock-strength indices. In this paper, in-situ point-load test data for the moderately weathered columnar stromatolitic limestone (t23) in the study area are analysed statistically; a χ² goodness-of-fit test yields the probability density function and distribution form of rock strength, with both compressive and tensile strengths following a lognormal distribution. On this basis, combined with fitting results for rocks of similar lithology, a probability distribution function of point-load strength for the moderately weathered columnar stromatolitic limestone in the area is established. Bayesian estimation is then used to infer the regional point-load strength, giving a predicted compressive strength of 72.42 MPa and a predicted tensile strength of 2.29 MPa, with small relative errors and a sample confidence above 70%. The method is effective for rock-strength estimation, and the resulting estimates are of practical significance for slope-stability evaluation and subsequent slope treatment in the study area.
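A hedged sketch of the distribution-fitting step: fit a lognormal law to point-load strengths and check it with a χ² goodness-of-fit test. The synthetic data and the 5-bin layout are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
ucs = rng.lognormal(mean=np.log(70.0), sigma=0.25, size=60)   # strengths, MPa

shape, loc, scale = stats.lognorm.fit(ucs, floc=0.0)          # lognormal fit

# chi-square test on equal-probability bins
k = 5
edges = stats.lognorm.ppf(np.linspace(0, 1, k + 1), shape, loc, scale)
observed, _ = np.histogram(ucs, bins=edges)
expected = np.full(k, ucs.size / k)
chi2 = np.sum((observed - expected) ** 2 / expected)
# degrees of freedom: k - 1 - (number of fitted parameters = 2)
p_value = stats.chi2.sf(chi2, df=k - 1 - 2)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")   # large p -> lognormal accepted
```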

5.
韩冰  王芝银 《岩土力学》2008,29(10):2661-2666
Identifying rheological constitutive models and their parameters from laboratory triaxial creep tests is an important means of studying the rheological behaviour of rock. Because of test conditions and random noise, the test data often contain outliers, and parameter estimates obtained with the traditional least-squares method can be strongly biased. To suppress the influence of outliers on parameter identification, robust estimation theory from statistics is introduced: the measured samples are assigned different weights and the rheological parameters are estimated by bisquare weighted least squares. Comparison of tests and computations shows that the method effectively reduces the influence of outliers on the identification results and outperforms traditional least-squares estimation in accuracy, convergence and robustness.
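A minimal IRLS sketch with Tukey bisquare (biweight) weights for a linear model y = Xb; the creep-model specifics are omitted, and the tuning constant 4.685 is the conventional choice, not taken from the paper:

```python
import numpy as np

def bisquare_irls(X, y, c=4.685, iters=50):
    b = np.linalg.lstsq(X, y, rcond=None)[0]              # OLS start
    for _ in range(iters):
        r = y - X @ b
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust MAD scale
        u = r / (c * s)
        w = np.where(np.abs(u) < 1, (1 - u ** 2) ** 2, 0.0)
        sw = np.sqrt(w)
        b = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return b

rng = np.random.default_rng(3)
t = np.linspace(0.1, 10, 40)
y = 2.0 + 0.5 * t + rng.normal(0, 0.05, t.size)
y[[5, 20]] += 3.0                                          # inject outliers
X = np.column_stack([np.ones_like(t), t])
print(bisquare_irls(X, y))                                 # ~ [2.0, 0.5]
```

Outliers end up with weight 0 under the bisquare function, which is why the fit stays near the true parameters where ordinary least squares would drift.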

6.
In hydrological frequency analysis, the extreme-value sample series are usually short and not very representative, so the estimated hydrological design values are uncertain. The Bootstrap method is used here to study how sampling uncertainty affects design values. Compared with traditional frequency-analysis methods, the Bootstrap-based approach provides not only point and interval estimates of the design value but also a quantitative assessment of its uncertainty. In addition, three schemes combining the Bootstrap technique with the method of moments, the weight-function method and the method of L-moments are set up to examine the effectiveness of the approach across parameter-estimation methods. The approach is applied to the 42-year annual rainfall series (1970-2011) of Nantong. The results show that, in terms of the expected design value, the 90% confidence interval and the final design value, the design results are insensitive to the choice of parameter-estimation method, and the approach avoids the poor generality and appreciable error of the B-value nomograph given in the design code.
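A hedged Bootstrap sketch: resample the annual series with replacement, refit the distribution each time, and read off point and interval estimates of a design quantile. The P-III fit via scipy's pearson3, the synthetic stand-in series and the 1%-exceedance design value are assumptions:

```python
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(4)
rain = rng.gamma(9.0, 120.0, size=42)        # stand-in for the 42-yr series

def design_value(sample, p_exceed=0.01):
    skew, loc, scale = pearson3.fit(sample)
    return pearson3.ppf(1 - p_exceed, skew, loc=loc, scale=scale)

boot = np.array([design_value(rng.choice(rain, rain.size, replace=True))
                 for _ in range(2000)])
lo90, hi90 = np.percentile(boot, [5, 95])
print(f"expected design value: {boot.mean():.0f}")
print(f"90% confidence interval: [{lo90:.0f}, {hi90:.0f}]")
```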

7.
李晓军  张振远 《岩土力学》2014,35(10):2881-2887
Kriging interpolation is widely used in stratum modelling because it accounts for the spatial statistical characteristics of the samples. For discontinuous strata (pinch-outs, missing layers and the like), however, the commonly used kriging variants (such as ordinary kriging) tend to give overly smooth thickness estimates that depart considerably from reality near the edges of a stratum. For such discontinuous strata, a thickness-estimation method combining indicator kriging and ordinary kriging is proposed: indicator kriging first estimates the extent of the stratum, ordinary kriging then estimates the thickness within that extent, and the estimation standard deviation quantifies the uncertainty of the thickness. Applied to stratum modelling for the Shanghai Yangtze River Tunnel and checked by cross-validation, the method markedly improves the thickness estimates near stratum edges, reducing the standard deviation by 15%-18% and yielding estimates closer to reality than ordinary kriging or linear interpolation.
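A two-step sketch of the combined approach: ordinary kriging applied first to a 0/1 presence indicator (i.e. indicator kriging), then to thickness where the stratum is predicted present. The exponential variogram and its parameters are assumptions; a production model would fit them to data:

```python
import numpy as np

def variogram(h, sill=1.0, rng_=300.0):           # exponential model
    return sill * (1.0 - np.exp(-3.0 * h / rng_))

def ordinary_krige(xy, z, x0):
    """Solve the ordinary-kriging system with a Lagrange multiplier."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    n = len(z)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = variogram(d); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z

bore_xy = np.array([[0., 0.], [100., 0.], [0., 100.], [150., 150.], [300., 50.]])
present = np.array([1., 1., 1., 0., 0.])          # stratum observed in borehole?
thick = np.array([4.2, 3.8, 5.1, 0.0, 0.0])       # thickness (m)

target = np.array([60., 40.])
p_present = ordinary_krige(bore_xy, present, target)   # indicator kriging step
if p_present > 0.5:                                    # inside estimated extent
    mask = present == 1
    print(ordinary_krige(bore_xy[mask], thick[mask], target))
```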

8.
《岩土力学》2017,(12):3555-3564
Test data, monitoring records and other observations for a specific geotechnical site are usually very limited, yet Bayesian methods can make full use of such limited site information to compensate for small test samples. To estimate parameter statistics effectively under limited samples, a slope-reliability updating method based on structural reliability methods and Bayesian Updating with Structural reliability (BUS) is proposed. Its effectiveness is verified by updating the reliability of an infinite slope with direct shear test data, and the influence of the prior information of the geotechnical parameters, such as test sample size, probability distribution and likelihood-function model, on the reliability updating is examined systematically. The results show that BUS can account for the parameter probability distribution and the likelihood model, accurately estimate parameter statistics from limited site information and update slope reliability, offering an effective route to slope-reliability updating under limited samples. The assumed probability distribution of the soil parameters strongly affects the updating results (posterior means, standard deviations and the updated failure probability): results based on the commonly used normal and lognormal distributions are conservative, whereas the likelihood model has a comparatively small influence. In addition, both the parameter uncertainty and the updated slope failure probability decrease as the test sample size grows, levelling off once the sample size is large enough.
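A hedged sketch of the BUS acceptance rule: draw from the prior and accept a draw when an auxiliary uniform variable falls below L(θ)/c, where c bounds the likelihood. The prior, the four shear-test observations and the normal likelihood are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
obs = np.array([33.1, 35.4, 34.2, 32.8])          # direct shear: friction angle, deg
sigma_e = 2.0                                      # assumed measurement std

def likelihood(phi):
    return np.prod(stats.norm.pdf(obs, loc=phi, scale=sigma_e))

c = likelihood(obs.mean())                         # likelihood bound at the MLE
prior = rng.normal(30.0, 5.0, size=200_000)        # prior on friction angle
u = rng.random(prior.size)
L = stats.norm.pdf(obs[None, :], loc=prior[:, None], scale=sigma_e).prod(axis=1)
post = prior[u <= L / c]                           # BUS-style acceptance

print(f"posterior mean {post.mean():.2f}, std {post.std():.2f}")
# the same accepted sample can be pushed through a performance function
# g(phi) to update the slope failure probability P[g < 0]
```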

9.
To improve the measurement accuracy and stability of measuring instruments, this paper discusses data-fusion methods suitable for such instruments. For a single-sensor instrument, a fusion method combining batch estimation with statistics is adopted; for multi-sensor measurement systems, two data-fusion methods, weighted averaging and neural networks, are discussed. Finally, the neural-network method is used to process the data of a measurement system built from multiple pressure sensors, and the results are analysed. Application examples verify the role of data fusion in improving the measurement accuracy and stability of measuring instruments.
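A minimal sketch of the weighted-average fusion step for a multi-sensor system: weight each sensor by the inverse of its error variance so the fused variance is minimized. The sensor readings and variances are made up:

```python
import numpy as np

readings = np.array([101.3, 100.9, 101.7])   # three pressure sensors (kPa)
variances = np.array([0.04, 0.09, 0.25])     # per-sensor error variances

w = (1.0 / variances) / np.sum(1.0 / variances)   # inverse-variance weights
fused = np.sum(w * readings)
fused_var = 1.0 / np.sum(1.0 / variances)
print(f"fused = {fused:.2f} kPa, variance = {fused_var:.4f}")
# fused_var is smaller than the best single sensor's 0.04, which is the
# accuracy gain the abstract attributes to data fusion
```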

10.
Derivation of the assurance-correction coefficient B for parameter estimation by the method of L-moments (cited by 3: 0 self-citations, 3 by others)
刘攀  郭生练  胡安焱 《水文》2006,26(6):27-29
In estimating the sampling error of design floods, the assurance-correction coefficient B offers a simple way to estimate the standard error. The method of L-moments estimates the parameters of the frequency curve with good unbiasedness and efficiency and has promising applications; however, no B-value nomograph has yet been established for L-moments, which makes it difficult and inconvenient to estimate sampling errors when the parameters are determined by L-moments. Using statistical experiments, the influence of the coefficient of variation, the coefficient of skewness and the sample length on B is analysed. The results show that the B-value nomograph approach to sampling error remains applicable when the parameters are estimated by L-moments; B values were derived for various skew coefficients and design frequencies and compiled into a B-value nomograph for the method of L-moments.
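A hedged sketch of the statistical experiment behind such a nomograph: repeatedly draw samples of length n from a fixed parent, estimate the design quantile from each sample, and record its relative standard error. The gamma parent and MLE quantile estimator stand in for the paper's P-III / L-moment machinery, which is more involved:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, p_exceed, trials = 50, 0.01, 2000
parent = stats.gamma(a=4.0, scale=250.0)
true_xp = parent.ppf(1 - p_exceed)

est = []
for _ in range(trials):
    sample = rng.gamma(4.0, 250.0, size=n)
    a, loc, scale = stats.gamma.fit(sample, floc=0.0)
    est.append(stats.gamma.ppf(1 - p_exceed, a, loc=loc, scale=scale))
est = np.asarray(est)
rel_se = est.std() / true_xp
print(f"relative standard error of x_1% at n={n}: {rel_se:.3f}")
# tabulating rel_se over grids of Cv, Cs and n is what the nomograph encodes
```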

11.
The effect of outliers on estimates of the variogram depends on how they are distributed in space. The ‘spatial breakdown point’ is the largest proportion of observations which can be drawn from some arbitrary contaminating process without destroying a robust variogram estimator, when they are arranged in the most damaging spatial pattern. A numerical method is presented to find the spatial breakdown point for any sample array in two dimensions or more. It is shown by means of some examples that such a numerical approach is needed to determine the spatial breakdown point for two or more dimensions, even on a regular square sample grid, since previous conjectures about the spatial breakdown point in two dimensions do not hold. The ‘average spatial breakdown point’ has been used as a basis for practical guidelines on the intensity of contaminating processes that can be tolerated by robust variogram estimators. It is the largest proportion of contaminating observations in a data set such that the breakdown point of the variance estimator used to obtain point estimates of the variogram is not exceeded by the expected proportion of contaminated pairs of observations over any lag. In this paper the behaviour of the average spatial breakdown point is investigated for cases where the contaminating process is spatially dependent. It is shown that in two dimensions the average spatial breakdown point is 0.25. Finally, the ‘empirical spatial breakdown point’, a tool for the exploratory analysis of spatial data thought to contain outliers, is introduced and demonstrated using data on metal content in the soils of Sheffield, England. The empirical spatial breakdown point of a particular data set can be used to indicate whether the distribution of possible contaminants is likely to undermine a robust variogram estimator.
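An exploratory sketch in the spirit of the empirical spatial breakdown point, on a 1-D transect for simplicity: for each lag, compute the proportion of observation pairs involving at least one suspected outlier and compare it with the breakdown point of the per-lag variance estimator. The grid, flagged locations and the 0.5 threshold (a highly robust estimator) are assumptions:

```python
import numpy as np

n = 100
flagged = np.zeros(n, dtype=bool)
flagged[40:55] = True                      # spatially clustered contaminants

for lag in range(1, 11):
    # pair (i, i+lag) is contaminated if either member is flagged
    pairs_bad = np.mean(flagged[:-lag] | flagged[lag:])
    status = "exceeds" if pairs_bad > 0.5 else "within"
    print(f"lag {lag:2d}: contaminated pairs {pairs_bad:.2f} ({status} 0.5)")
# if any lag exceeds the estimator's breakdown point, the robust variogram
# estimate at that lag can no longer be trusted
```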

12.
Samples from hazardous waste site investigations frequently come from two or more statistical populations. Assessment of background levels of contaminants can be a significant problem. This problem is being investigated at the U.S. Environmental Protection Agency's Environmental Monitoring Systems Laboratory in Las Vegas. This paper describes a statistical approach for assessing background levels from a dataset. The elevated values that may be associated with a plume or contaminated area of the site are separated from lower values that are assumed to represent background levels. It would be desirable to separate the two populations either spatially by Kriging the data or chronologically by a time series analysis, provided an adequate number of samples were properly collected in space and/or time. Unfortunately, quite often the data are too few in number or too improperly designed to support either spatial or time series analysis. Regulations typically call for nothing more than the mean and standard deviation of the background distribution. This paper provides a robust probabilistic approach for gaining this information from poorly collected data that are not suitable for above-mentioned alternative approaches. We assume that the site has some areas unaffected by the industrial activity, and that a subset of the given sample is from this clean part of the site. We can think of this multivariate data set as coming from two or more populations: the background population, and the contaminated populations (with varying degrees of contamination). Using robust M-estimators, we develop a procedure to classify the sample into component populations. We derive robust simultaneous confidence ellipsoids to establish background contamination levels. Some simulated as well as real examples from Superfund site investigations are included to illustrate these procedures. The method presented here is quite general and is suitable for many geological and biological applications.
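A hedged univariate sketch of the core idea: a Huber M-estimate of location and scale is barely affected by the contaminated subpopulation, so observations far from it in robust z-score can be flagged as non-background. The Huber constant 1.345 and the 3-sigma flag rule are conventional assumptions, not the paper's multivariate procedure:

```python
import numpy as np

def huber_location(x, c=1.345, iters=50):
    mu = np.median(x)
    s = np.median(np.abs(x - mu)) / 0.6745          # MAD scale
    for _ in range(iters):
        u = (x - mu) / s
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        mu = np.sum(w * x) / np.sum(w)              # reweighted location
    return mu, s

rng = np.random.default_rng(7)
background = rng.normal(12.0, 2.0, 80)              # clean part of the site
plume = rng.normal(60.0, 10.0, 20)                  # contaminated samples
x = np.concatenate([background, plume])

mu, s = huber_location(x)
flag = np.abs(x - mu) / s > 3.0
print(f"robust background mean {mu:.1f}, flagged {flag.sum()} of {x.size}")
```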

13.
A model is presented for estimating the value of information of sampling programs for contaminated soil. The purpose is to calculate the optimal number of samples when the objective is to estimate the mean concentration. A Bayesian risk–cost–benefit decision analysis framework is applied and the approach is design-based. The model explicitly includes sample uncertainty at a complexity level that can be applied to practical contaminated land problems with limited amount of data. Prior information about the contamination level is modelled by probability density functions. The value of information is expressed in monetary terms. The most cost-effective sampling program is the one with the highest expected net value. The model was applied to a contaminated scrap yard in Göteborg, Sweden, contaminated by metals. The optimal number of samples was determined to be in the range of 16–18 for a remediation unit of 100 m2. Sensitivity analysis indicates that the perspective of the decision-maker is important, and that the cost of failure and the future land use are the most important factors to consider. The model can also be applied for other sampling problems, for example, sampling and testing of wastes to meet landfill waste acceptance procedures.
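A toy sketch of the decision framework: for each candidate number of samples, expected net value = (reduction in expected failure cost from better information) − (sampling cost). All monetary figures, the error model se = σ/√n and the misclassification rule are invented for illustration only:

```python
import numpy as np
from scipy import stats

guideline, sigma = 100.0, 60.0        # action level and field variability (mg/kg)
true_mean = 90.0                      # site close to the limit -> info is valuable
cost_sample, cost_failure = 2_000.0, 1_000_000.0

def expected_net_value(n):
    se = sigma / np.sqrt(n)
    # probability the sample mean wrongly clears a site near the limit
    p_wrong = stats.norm.sf(guideline, loc=true_mean, scale=se)
    risk_no_info = stats.norm.sf(guideline, loc=true_mean, scale=sigma)
    value = (risk_no_info - p_wrong) * cost_failure
    return value - n * cost_sample

for n in (9, 25, 49, 64, 100):
    print(n, round(expected_net_value(n)))
# the n with the highest expected net value is the optimal sampling effort
```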

14.
纪忠华  王璐  路雨 《水文》2017,37(4):6-11
Taking the daily mean discharge at the outlet of the Ziluoshan sub-catchment of the Huai River as the study object, a peaks-over-threshold (POT) model is adopted; the parameters of the generalized Pareto (GP) distribution are estimated by maximum likelihood, and return levels together with their confidence intervals are computed. Goodness-of-fit tests show that the POT model enlarges the flood sample and improves data utilization while still fitting the empirical plotting positions well. Return-period computations with flow records of five different lengths show that record length strongly affects the uncertainty of the results; in engineering hydrology it is recommended to adopt a suitable upper confidence bound as the design value.
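A hedged POT sketch: keep exceedances over a threshold, fit a generalized Pareto law by maximum likelihood, and convert a T-year return period into a return level. The threshold choice (98th percentile) and the synthetic daily series are assumptions:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(8)
q = rng.gamma(2.0, 80.0, size=365 * 20)              # 20 yr of daily flows
u = np.quantile(q, 0.98)                             # POT threshold
exc = q[q > u] - u

c, loc, scale = genpareto.fit(exc, floc=0.0)         # MLE for shape/scale
lam = exc.size / 20.0                                # mean exceedances per year

T = 100.0                                            # return period, years
level = u + genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0, scale=scale)
print(f"{T:.0f}-yr return level ~ {level:.0f}")
# a bootstrap over `exc` would give the confidence band discussed above
```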

15.
A computation method for staged design-flood standards (cited by 3: 0 self-citations, 3 by others)
邹鹰 《水文》2007,27(2):54-56
For staged reservoir design and operation, the key questions are how to define the reservoir flood-control standard under seasonal staging and how to set the design-flood standard of each stage reasonably without relaxing the overall reservoir flood-control standard. Based on the probability-combination principle for independent events, an equivalent expression of the reservoir flood-control standard under staged operation is proposed. Using one reservoir as a case study, the design-flood standard of each stage and the corresponding flood-limited water level are determined while ensuring that the equivalent flood-control standard under staged operation does not fall below the reservoir's flood-control standard.
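A small numeric sketch of the independent-event combination: if the flood season is split into stages with annual exceedance probabilities p_i, the combined annual exceedance is 1 − ∏(1 − p_i). The stage values below are illustrative, chosen so the combined standard still meets a 1% (100-year) requirement:

```python
import numpy as np

p_stages = np.array([0.004, 0.004, 0.002])     # staged design frequencies
p_combined = 1.0 - np.prod(1.0 - p_stages)
print(f"combined exceedance {p_combined:.4%}  (~{1/p_combined:.0f}-yr standard)")
# combined ~ 0.997%, i.e. the staged scheme still satisfies a 100-year
# (1%) reservoir flood-control standard
```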

16.
雷晓云  何春梅 《水文》2004,24(4):5-8
For some rivers in Xinjiang the measured flood series are short, and traditional statistical models are inadequate for risk estimation. The fuzzy-mathematics method of information diffusion theory is therefore introduced and a practical flood-risk assessment model is proposed. Taking the rainstorm-snowmelt floods of the Xindahe in the Aksu River basin of Xinjiang as an example, flood risk is estimated with satisfactory results.
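A hedged sketch of normal information diffusion: each observation spreads its unit of information over a grid of monitoring points with a Gaussian kernel, yielding a smooth exceedance-probability estimate from a short series. The bandwidth rule h = 2.6851(b−a)/(m−1) is one published heuristic; treat it, the data and the grid as assumptions:

```python
import numpy as np

floods = np.array([320., 410., 275., 500., 360., 430., 290., 380.])  # m3/s
u = np.linspace(250., 550., 31)                    # monitoring grid
m = floods.size
h = 2.6851 * (floods.max() - floods.min()) / (m - 1)

f = np.exp(-((floods[:, None] - u[None, :]) ** 2) / (2 * h * h))
f = f / f.sum(axis=1, keepdims=True)               # normalize each sample
q = f.sum(axis=0)                                  # diffused information
p = q / q.sum()                                    # probability over grid
exceed = p[::-1].cumsum()[::-1]                    # P(Q >= u)
print(f"P(Q >= 450) ~ {exceed[u >= 450][0]:.3f}")
```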

17.
The geometric average is often used to estimate the effective (large-scale) permeability from smaller-scale samples. In doing so, one assumes that the geometric average is a good estimator of the geometric mean. Problems with this estimator arise, however, when one or more of the samples has a very low value. The estimate obtained becomes very sensitive to the small values in the sample set, while the true effective permeability may be only weakly dependent on these small values. Several alternative methods of estimating the geometric mean are suggested. In particular, a more robust estimator of the geometric mean, the jth Winsorized mean, is proposed and several of its properties are compared with those of the geometric average.
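A hedged sketch contrasting the plain geometric average with a jth Winsorized geometric mean: the j smallest and j largest log-values are pulled in to their neighbours before averaging, which de-sensitizes the estimate to a single anomalously low permeability. The data and j = 1 are assumptions:

```python
import numpy as np

def winsorized_geomean(k, j=1):
    logs = np.sort(np.log(k))
    logs[:j] = logs[j]              # pull in the j smallest values
    logs[-j:] = logs[-j - 1]        # ... and the j largest
    return np.exp(logs.mean())

perm = np.array([120., 95., 150., 80., 110., 0.001])   # mD, one tiny outlier
print(f"geometric average: {np.exp(np.log(perm).mean()):.2f}")
print(f"winsorized (j=1):  {winsorized_geomean(perm):.2f}")
# the plain geometric average collapses toward the freak 0.001 mD sample,
# while the Winsorized version stays near the bulk of the data
```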

18.
Robust estimation of rock shear-strength parameters (cited by 2: 0 self-citations, 2 by others)
张飞  赵玉仑 《岩土力学》1999,20(1):53-56
A computational model and method based on M-robust estimation are presented for computing the shear-strength parameters c and f of a rock mass. Comparison of the robust estimates with least-squares results for in-situ test data from the main Bayan Obo orebody shows that the parameters obtained by robust estimation are more reliable.
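A hedged sketch of the estimation: fit the Coulomb line τ = c + f·σn with an M-estimator (Huber norm) via statsmodels RLM, so one bad shear test does not drag c and f. The test data are made up, with one deliberately bad point:

```python
import numpy as np
import statsmodels.api as sm

sigma_n = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])      # normal stress, MPa
tau = np.array([0.85, 1.25, 1.60, 2.05, 1.10, 2.80])    # shear strength, MPa
X = sm.add_constant(sigma_n)                             # columns: [1, sigma_n]

ols = sm.OLS(tau, X).fit()
rlm = sm.RLM(tau, X, M=sm.robust.norms.HuberT()).fit()  # M-robust estimate
print("least squares  c, f:", np.round(ols.params, 3))
print("M-estimate     c, f:", np.round(rlm.params, 3))
# the robust fit down-weights the 1.10 MPa outlier, as in the paper's
# comparison with ordinary least squares
```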

19.
Raman spectroscopy of carbonaceous material (RSCM) is frequently used to determine metamorphic peak temperatures from the structural order of carbonaceous material enclosed in metasediments. This method provides a quick, robust and relatively cheap geothermometer. However, the comparability of the RSCM parameter is low as there are at least three major sources of biasing factors. These sources are the spectral curve-fitting procedure, the sample characteristics itself and the experimental design including the used Raman system. To assess the impacts of the biasing factors on RSCM, a series of experiments was performed. The experiments showed that curve-fitting is strongly influenced by individual operator-bias and the degrees of freedom in the model, implying the need for a standardised curve-fitting procedure. Due to the diversity of components (optics, light detection device, gratings, etc.) and their combinations within the Raman systems, different Raman instruments generally give differing results. Consequently, to estimate comparable metamorphic temperatures from RSCM data, every Raman instrument needs its own calibration. This demands a reference material series that covers the entire temperature calibration range. Although sample heterogeneity will still induce some variation, a reference material series combined with standardised curve-fitting procedures will significantly increase the overall comparability of RSCM data from different laboratories.
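A hedged sketch of the curve-fitting step: decompose a first-order Raman spectrum of carbonaceous material into G, D1 and D2 bands and form the R2 area ratio used by the widely cited Beyssac et al. (2002) calibration T(°C) = −445·R2 + 641. The synthetic spectrum, the fixed band positions and the all-Lorentzian band shapes are simplifying assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentz(x, a, x0, g):
    return a * g**2 / ((x - x0) ** 2 + g**2)

def model(x, a1, g1, a2, g2, a3, g3):
    return (lorentz(x, a1, 1580., g1)      # G band
            + lorentz(x, a2, 1350., g2)    # D1 band
            + lorentz(x, a3, 1620., g3))   # D2 band

x = np.linspace(1200., 1750., 400)
rng = np.random.default_rng(9)
y = model(x, 900., 25., 600., 40., 150., 20.) + rng.normal(0., 10., x.size)

popt, _ = curve_fit(model, x, y, p0=[800., 30., 500., 30., 100., 30.])
area = lambda a, g: np.pi * a * g          # analytic Lorentzian area
A_g, A_d1, A_d2 = area(*popt[0:2]), area(*popt[2:4]), area(*popt[4:6])
R2 = A_d1 / (A_g + A_d1 + A_d2)
print(f"R2 = {R2:.3f}, T ~ {-445.0 * R2 + 641.0:.0f} C")
```

Fixing the model's degrees of freedom as above is exactly the kind of standardised fitting procedure the abstract argues for.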
