1105 results found (search time: 328 ms); results 151-160 are shown below.
151.
Markov chain Monte Carlo algorithms are commonly employed for accurate uncertainty appraisal in non-linear inverse problems. The downside of these algorithms is the considerable number of samples needed to achieve reliable posterior estimates, especially in high-dimensional model spaces. To overcome this issue, the Hamiltonian Monte Carlo algorithm has recently been introduced to solve geophysical inversions. Unlike classical Markov chain Monte Carlo algorithms, this approach exploits the derivative information of the target posterior probability density to guide the sampling of the model space. However, its main downside is the computational cost of the derivative computation, i.e. the computation of the Jacobian matrix around each sampled model. Possible strategies to mitigate this issue are reducing the dimensionality of the model space and/or using efficient methods to compute the gradient of the target density. Here we focus on the estimation of elastic properties (P- and S-wave velocities and density) from pre-stack data through a non-linear amplitude-versus-angle inversion in which the Hamiltonian Monte Carlo algorithm is used to sample the posterior probability. To decrease the computational cost of the inversion procedure, we employ the discrete cosine transform to reparametrize the model space, and we train a convolutional neural network to predict the Jacobian matrix around each sampled model. The training data set for the network is also parametrized in the discrete cosine transform space, thus reducing the number of parameters to be optimized during the learning phase. Once trained, the network can compute the Jacobian matrix associated with each sampled model in real time.
The outcomes of the proposed approach are compared and validated against the predictions of Hamiltonian Monte Carlo inversions in which an accurate but computationally expensive finite-difference scheme is used to compute the Jacobian matrix, and against those obtained by replacing the Jacobian with a matrix operator derived from a linear approximation of the Zoeppritz equations. Synthetic and field inversion experiments demonstrate that the proposed approach dramatically reduces the cost of the Hamiltonian Monte Carlo inversion while preserving an accurate and efficient sampling of the posterior probability.
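The discrete cosine transform reparametrization at the heart of this scheme can be sketched as follows. A smooth elastic profile of n samples is represented by its first k DCT coefficients, so the sampler explores k << n parameters; the synthetic Vp profile and the values of n and k below are illustrative, not the study's actual settings.

```python
import numpy as np

n, k = 128, 16                       # full model size, retained coefficients
depth = np.linspace(0.0, 1.0, n)
vp = 2.0 + 1.5 * depth + 0.1 * np.sin(8 * np.pi * depth)  # toy Vp profile

# Orthonormal DCT-II matrix: row j holds frequency j, so D @ D.T = I.
i = np.arange(n)
D = np.sqrt(2.0 / n) * np.cos(np.pi * (i[None, :] + 0.5) * i[:, None] / n)
D[0, :] /= np.sqrt(2.0)

coeffs = D @ vp                      # forward DCT of the model
vp_rec = D[:k].T @ coeffs[:k]        # reconstruction from only k coefficients

rel_err = np.linalg.norm(vp - vp_rec) / np.linalg.norm(vp)
```

Because smooth profiles concentrate their energy in the low-order coefficients, the truncated reconstruction stays close to the full model while the sampled space shrinks from 128 to 16 dimensions.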
152.
Owing to the complexity of earthquake preparation processes and the limitations of observation techniques, the correspondence between anomalous changes in different seismic observation data and subsequent larger earthquakes is uncertain, so expressing forecasts probabilistically is a scientifically appropriate practice. Based on a Poisson-distributed background earthquake probability forecast for hazardous zones and on the historical forecasting performance of individual methods (covering the seismicity, fluid, deformation, and electromagnetic disciplines), we use Bayes' theorem to compute short-term or annual earthquake hazard probability forecasts for each individual method, and then apply a combined-probability approach to produce short-term or annual forecasts that integrate multiple individual methods. Preliminary short-term results show that from February to September 2018, 72% of the M ≥ 5 earthquakes in mainland China fell within the relatively high-probability forecast areas.
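A minimal sketch of the Bayesian updating step described above; the background rate and the historical hit/false-alarm rates of each method are invented numbers, not the efficiencies used in the study.

```python
import math

# Background hazard: Poisson probability of at least one target event
# in the forecast window (the annual rate here is an assumed value).
rate = 0.4
p_prior = 1.0 - math.exp(-rate)

def bayes_update(p, p_anom_given_eq, p_anom_given_no):
    """Posterior event probability after one method reports an anomaly."""
    num = p_anom_given_eq * p
    return num / (num + p_anom_given_no * (1.0 - p))

# A single method with assumed historical performance:
p_post = bayes_update(p_prior, 0.6, 0.2)

# Naive combination of several methods (e.g. seismicity, fluid, deformation),
# treating their anomalies as conditionally independent:
p_combined = p_prior
for hit_rate, false_rate in [(0.6, 0.2), (0.5, 0.25), (0.55, 0.3)]:
    p_combined = bayes_update(p_combined, hit_rate, false_rate)
```

Each method whose historical hit rate exceeds its false-alarm rate raises the hazard probability above the Poisson background when an anomaly is observed.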
153.
王建, 柏春广, 徐永辉. 《沉积学报》(Acta Sedimentologica Sinica), 2006, 24(4): 562-569
Field observations of the development of tidal laminae on the muddy tidal flats of central Jiangsu, together with laboratory measurements of deposition rates and analyses of grain size and roundness of the collected samples, reveal the origin of tidal bedding in the muddy tidal-flat deposits of the central Jiangsu coast: millimetre-scale thin sand-mud couplets are the product of the semi-diurnal tide, whereas centimetre-scale thick sand-mud couplets are the product of the semi-monthly astronomical tide. In addition, field observations of the flat surface before and after the storm surge induced by Typhoon No. 9711 show that the upper part of the central Jiangsu muddy tidal flat contains a zone that is not eroded even during storm surges. Compared with normal tidal-flat deposits, the storm deposits in this zone are coarser, more poorly sorted, and slightly better rounded, show clear graded bedding, and typically develop parallel or wavy cross bedding.
154.
Discard management needs to draw on scientific research and advice, usually supported by specific statistical modeling. A wide range of statistical analysis methods have been applied to fishery data in attempts to identify the factors that influence discarded-species composition. While such approaches are important, they remain incomplete for disentangling the economic and spatio-temporal factors of the discarding process and for obtaining a complete view of the issue. Our study aims to fill this gap by identifying, describing, and quantifying the factors that influence discards in trawl fisheries using a multivariate approach based on five complementary aspects: "economic", "vessel characteristics", "spatial", "temporal" and "environmental". In addition, a spatial multi-criteria approach was used to investigate discard hot-spot areas using ecological criteria such as the vulnerability and resilience of the discarded species. Using these ecological criteria concentrates conservation effort on the most relevant sites, minimizing discards of a variety of potentially vulnerable species. The approach was applied to a case study of a multi-species demersal bottom-trawl fishery in northern Spain, in the Cantabrian Sea (ICES area VIIIc). Results show how strongly spatial and economic factors affect discarded-species composition, identifying specific spatio-temporal discard hot-spots to be preferentially avoided by fishers. Mitigation measures in future fisheries management strategies should be implemented at multiple stages of the discarding process, both in the selection of fishing grounds and in the economic valorization of the discarded species.
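A spatial multi-criteria score of the kind described can be sketched as a weighted sum of min-max-normalized criteria per fishing-ground cell; the cell values, criteria and weights below are entirely hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical criteria per fishing-ground cell (rows): discard rate,
# mean vulnerability of the discarded species, and inverse resilience.
cells = ["A", "B", "C", "D"]
criteria = np.array([
    [0.30, 0.8, 0.7],   # cell A
    [0.10, 0.4, 0.3],   # cell B
    [0.45, 0.9, 0.8],   # cell C
    [0.20, 0.5, 0.6],   # cell D
])
weights = np.array([0.5, 0.3, 0.2])   # assumed relative importance

# Min-max normalise each criterion column, then take the weighted sum.
lo, hi = criteria.min(axis=0), criteria.max(axis=0)
scores = ((criteria - lo) / (hi - lo)) @ weights
hotspot = cells[int(np.argmax(scores))]
```

The cell with the highest composite score is the candidate hot-spot to be preferentially avoided; in a real application the weights would come from stakeholder or expert elicitation.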
155.
Bayesian and restricted maximum likelihood (REML) approaches were used to estimate genetic parameters in a cultured turbot (Scophthalmus maximus) stock. The data set consisted of harvest body weights of 2462 progeny (17 months old) from 28 families produced through artificial insemination of 39 parent fish. An animal model was applied to partition each weight into a fixed effect, an additive genetic effect, and a residual effect. The average body weight of each family, measured at 110 days post-hatching, was treated as a covariate. For the Bayesian analysis, heritability and breeding values were estimated using both the posterior mean and the posterior mode of the joint posterior conditional distribution. The results revealed that for the additive genetic variance, the posterior mean estimate (σ_a² = 9320) was highest, with the smallest residual variance; the REML estimate (σ_a² = 8088) came second; and the posterior mode estimate (σ_a² = 7849) was lowest. The three corresponding heritability estimates followed the same trend as the additive genetic variance, and all were high. The Pearson correlations between each pair of the three estimates of breeding values were all high, particularly that between the posterior mean and REML estimates (0.9969). These results show that the differences between the Bayesian and REML methods in estimating heritability and breeding values were small. This study provides another feasible method of genetic parameter estimation for selective breeding programs of turbot.
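The heritability trend reported above follows directly from the variance decomposition of the animal model, h² = σ_a² / (σ_a² + σ_e²). The additive variances below are the three estimates quoted in the abstract, but the residual variances are hypothetical placeholders (the abstract gives only their ordering, with the posterior mean having the smallest residual term).

```python
def heritability(sigma2_a, sigma2_e):
    """Narrow-sense heritability from animal-model variance components."""
    return sigma2_a / (sigma2_a + sigma2_e)

# Additive variances from the abstract; residual variances are invented,
# ordered so the posterior mean keeps the smallest residual term.
estimates = {
    "posterior mean": (9320.0, 8500.0),
    "REML":           (8088.0, 9000.0),
    "posterior mode": (7849.0, 9200.0),
}
h2 = {name: heritability(a, e) for name, (a, e) in estimates.items()}
```

With this ordering the posterior mean yields the highest heritability and the posterior mode the lowest, matching the trend described in the abstract.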
156.
Analyses of the grain-size characteristics and elemental composition of surficial sediments of the Kumukuli Desert, the highest-elevation desert in the world, on the northern Tibetan Plateau, together with clastic sediments from the surrounding alluvial fans, show that: (1) the desert's surficial sediments are dominated by fine sand, followed by very fine sand, with minor silt and medium sand, a small amount of clay, and no coarse sand component, and the grain-size composition varies little between samples; (2) Mz ranges from 2.70 to 2.90 φ, σ1 from 0.80 to 1.10 φ, SK1 from 0.26 to 0.44, and KG from 2.27 to 3.62; (3) the grain-size distributions are mainly near-symmetric and unimodal, but with a relatively long fine-grained tail; (4) the elemental composition of the desert surficial sediments is similar to that of the alluvial-proluvial clastic sediments of the surrounding piedmonts, especially the northern piedmont of the A'erka Mountains west of the desert; and (5) provenance discrimination indicates that the surficial sediments are derived mainly from clastic sediments of the northern piedmont of the A'erka Mountains and the southern piedmont of the Qimantag Mountains, formed mainly in fluvial and shallow-lacustrine depositional environments.
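The parameters Mz, σ1, SK1 and KG quoted in point (2) are the standard Folk and Ward (1957) graphic grain-size measures, computed from φ percentiles of the cumulative grain-size curve. The sketch below applies the standard formulas to a hypothetical set of percentiles chosen only so the results fall inside the reported ranges; these are not measured values from the study.

```python
# Folk & Ward (1957) graphic grain-size parameters from phi percentiles.
def folk_ward(p):          # p maps percentile (int) -> phi value
    mz = (p[16] + p[50] + p[84]) / 3                       # graphic mean
    sigma1 = (p[84] - p[16]) / 4 + (p[95] - p[5]) / 6.6    # sorting
    sk1 = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
           + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))  # skewness
    kg = (p[95] - p[5]) / (2.44 * (p[75] - p[25]))         # kurtosis
    return mz, sigma1, sk1, kg

# Hypothetical percentiles (phi units) for a fine-sand-dominated sample
# with a long fine tail, as described in the abstract:
phi = {5: 1.5, 16: 2.2, 25: 2.45, 50: 2.65, 75: 2.97, 84: 3.4, 95: 5.1}
mz, sigma1, sk1, kg = folk_ward(phi)
```

The long fine tail (large φ95 relative to φ84) is what drives the strongly positive SK1 and high KG values of the kind reported.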
157.
Field data are commonly used to determine soil parameters for geotechnical analysis. Bayesian analysis allows field data to be combined with other information on soil parameters in a consistent manner. We show that the spatial variability of soil properties and the associated measurements can be captured through two different modelling approaches. In the first approach, a single random variable (RV) represents the soil property within the area of interest, while the second approach models the spatial variability explicitly with a random field (RF). As an example, we apply the Bayesian concept to the reliability assessment of a shallow foundation in a silty soil with spatially variable data. We show that the simpler RV approach is applicable when the measurements do not influence the correlation structure of the soil property in the vicinity of the foundation. In other cases it is expected to underestimate the reliability, and an RF model is required to obtain accurate results.
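In the RV approach, combining field measurements with prior information reduces, in the simplest Gaussian case, to a conjugate normal-normal update of the single soil parameter. The sketch below illustrates this; the prior, the measurement-error standard deviation and the data values are all assumed numbers, not the paper's case-study inputs.

```python
import math

# Conjugate normal-normal update of a single RV soil parameter.
mu0, sd0 = 30.0, 4.0              # assumed prior on friction angle (degrees)
sd_e = 2.0                        # assumed measurement-error sd
data = [27.5, 28.4, 26.9]         # hypothetical field measurements

n = len(data)
prec0 = 1.0 / sd0**2              # prior precision
prec_d = n / sd_e**2              # data precision
mu_post = (prec0 * mu0 + prec_d * (sum(data) / n)) / (prec0 + prec_d)
sd_post = math.sqrt(1.0 / (prec0 + prec_d))
```

The posterior mean is pulled from the prior toward the sample mean, and the posterior standard deviation is always smaller than the prior one; the RF approach replaces this scalar update with an update of the whole spatial correlation structure.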
158.
This paper addresses two important issues of concern to practicing engineers and researchers alike in the application of performance-based seismic assessment (PBSA) methodology to buildings: (i) the number of ground motion records required to exercise PBSA (current practice, FEMA P-58-1, requires eleven or more pairs of motions for this purpose), and (ii) the time and effort associated with performing the number of nonlinear response history analyses that PBSA requires. We present a method for exercising PBSA that employs classical linear modal analysis to develop a first (a priori) estimate of the probability distribution of loss, and then uses Bayesian statistics to update this estimate (i.e., obtain the posterior) with loss estimates from a small number of nonlinear response history analyses of a detailed model of the building. The proposed technique is used to assess the distribution of monetary loss for two case studies, a 4-story reinforced concrete moment-resisting frame building and a 20-story steel moment-resisting frame building, both located in Los Angeles, for a ground motion hazard with 10% probability of exceedance in 50 years. The efficiency of the proposed method is demonstrated by the similarity between the distributions of monetary loss at each story of the case-study buildings obtained from the traditional, more elaborate PBSA methodology and from the proposed method. Copyright © 2015 John Wiley & Sons, Ltd.
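The prior-to-posterior updating idea can be sketched under a lognormal loss model with known record-to-record variability; every number below (prior median from the modal-analysis estimate, dispersions, the handful of NRHA losses) is hypothetical and not from the paper's case studies.

```python
import math

# Prior on log-loss from a simplified modal-analysis estimate (assumed):
mu0, sd0 = math.log(2.0e6), 0.6     # prior median loss $2.0M, dispersion 0.6
sd_like = 0.4                       # assumed record-to-record variability

# Hypothetical losses from a small number of NRHA runs ($):
nrha_losses = [2.9e6, 2.4e6, 3.4e6, 2.6e6]
logs = [math.log(x) for x in nrha_losses]

# Conjugate normal update on the log scale (known-variance case):
n = len(logs)
prec0, prec_d = 1.0 / sd0**2, n / sd_like**2
mu_post = (prec0 * mu0 + prec_d * (sum(logs) / n)) / (prec0 + prec_d)
sd_post = math.sqrt(1.0 / (prec0 + prec_d))
median_loss = math.exp(mu_post)
```

Even four nonlinear analyses pull the cheap prior estimate toward the detailed-model results while sharply narrowing its dispersion, which is the efficiency argument the paper makes against running eleven or more records from the start.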
159.
In this paper, a Bayesian sequential sensor placement algorithm based on robust information entropy is proposed for multiple types of sensors. The presented methodology has two salient features. First, it is a holistic approach in which the overall performance of the various types of sensors at different locations is assessed, so it provides a rational and effective strategy for designing a sensor configuration that optimizes the use of the available resources. Second, the sequential algorithm is very efficient owing to its Bayesian nature, in which a prior distribution can be incorporated; it thereby avoids the unidentifiability problem that can be encountered in a sequential process starting with a small number of sensors. The proposed algorithm is demonstrated on a shear building and a lattice tower, with up to four types of sensors considered. Copyright © 2014 John Wiley & Sons, Ltd.
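The sequential selection loop can be sketched with a log-determinant information gain, a common Gaussian-case proxy for entropy reduction (the paper's robust information entropy criterion is more general); the mode-shape matrix, the prior precision and the sensor budget below are all invented.

```python
import numpy as np

# Greedy sequential placement: at each step add the candidate location
# whose measurement row most increases the information log-determinant,
# i.e. most reduces the posterior entropy of the modal coordinates.
rng = np.random.default_rng(0)
phi = rng.standard_normal((10, 3))   # hypothetical mode shapes, 10 candidate DOFs
prior_info = 1e-3 * np.eye(3)        # weak prior precision

def logdet(m):
    return np.linalg.slogdet(m)[1]

chosen, info = [], prior_info.copy()
for _ in range(4):                   # place four sensors
    gains = [(logdet(info + np.outer(phi[i], phi[i])), i)
             for i in range(len(phi)) if i not in chosen]
    _gain, best = max(gains)         # best candidate this round
    info += np.outer(phi[best], phi[best])
    chosen.append(best)
```

Because the prior precision enters the objective from the first step, the criterion stays well defined even when fewer sensors than identifiable parameters have been placed, which mirrors how the Bayesian prior avoids the unidentifiability problem.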
160.
This contribution addresses two developing areas of sediment fingerprinting research: first, how to improve the temporal resolution of source apportionment estimates whilst minimizing analytical costs, and second, how to consistently quantify all perceived uncertainties associated with the sediment mixing model procedure. The first matter is tackled by using direct X-ray fluorescence spectroscopy (XRFS) and diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) analyses of suspended particulate matter (SPM)-covered filter papers in conjunction with automatic water samplers. This method enables SPM geochemistry to be monitored quickly, accurately, inexpensively and non-destructively at high temporal resolution throughout numerous precipitation events. We then employed a Bayesian mixing model procedure that fully characterizes spatial geochemical variability, instrument precision and residual error, yielding a realistic and coherent assessment of the uncertainties associated with source apportionment estimates. Applying these methods to SPM data from the River Wensum catchment, UK, we were able to apportion, with uncertainty, sediment contributions from eroding arable topsoils, damaged road verges, and combined subsurface channel bank and agricultural field drain sources at 60- and 120-minute resolution for the duration of five precipitation events. The results demonstrate how combining Bayesian mixing models with direct spectroscopic analysis of SPM-covered filter papers can produce high-temporal-resolution source apportionment estimates that can assist with appropriately targeting sediment pollution mitigation measures at the catchment level. © 2015 The Authors. Earth Surface Processes and Landforms published by John Wiley & Sons Ltd.
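The mixing-model idea can be sketched as a toy Bayesian un-mixing of two tracers over three sources, with a uniform prior over the proportion simplex evaluated on a grid; the source signatures, mixture values and uncertainties below are invented, and the paper's actual model additionally separates spatial variability, instrument precision and residual error.

```python
import numpy as np

# Hypothetical mean tracer signatures (rows = sources, cols = tracers):
sources = np.array([[10.0, 2.0],    # arable topsoil
                    [ 4.0, 8.0],    # road verge
                    [ 7.0, 5.0]])   # subsurface/channel bank + drains
mixture = np.array([7.6, 4.4])      # observed SPM tracer concentrations
sd = np.array([0.5, 0.5])           # assumed combined tracer uncertainty

# Grid over the proportion simplex; Gaussian likelihood of the mixture.
step = 0.02
props, weights = [], []
for a in np.arange(0.0, 1.0 + step, step):
    for b in np.arange(0.0, 1.0 - a + step, step):
        p = np.array([a, b, 1.0 - a - b])
        pred = p @ sources                       # predicted mixture tracers
        loglik = -0.5 * np.sum(((mixture - pred) / sd) ** 2)
        props.append(p)
        weights.append(np.exp(loglik))

props, weights = np.array(props), np.array(weights)
mean_prop = (weights[:, None] * props).sum(axis=0) / weights.sum()
```

The posterior mean proportions sum to one by construction, and the spread of the weighted samples around them is the uncertainty that the paper propagates into its source apportionment estimates.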
Copyright©北京勤云科技发展有限公司  京ICP备09084417号