71.
Accurate identification of vulnerable areas is critical for groundwater resource protection and management. The present study employed a modified DRASTIC model to assess the groundwater vulnerability of the Jianghan Plain, a major farming area in central China. The DRASTICL model was developed by incorporating a land-use factor into the original model. The ratings and weights of the selected parameters were optimized by the analytic hierarchy process (AHP) and by genetic algorithms (GAs), respectively, and a combined AHP–GA method was proposed to develop the methodology further. Unity-based normalization was used to categorize the vulnerability maps into four classes: very high (>0.75), high (0.5–0.75), low (0.25–0.5), and very low (<0.25). The accuracy of the vulnerability mapping was validated with the Pearson correlation coefficient between the vulnerability index and the nitrate concentration in groundwater, and with the analysis-of-variance F statistic. The results revealed that the modified DRASTIC model was a substantial improvement over the conventional model: the correlation coefficient increased significantly from 41.07% to 75.31% after modification. Sensitivity analysis indicated that the depth to groundwater, with a mean effective weight of 39.28%, was the most critical factor affecting groundwater vulnerability. The vulnerability model proposed in this study could provide objective information for groundwater and environmental management at the local level and new ideas for international researchers.
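The unity-based normalization and four-class scheme described in the abstract can be sketched as follows. The raw index values are hypothetical; only the class thresholds come from the abstract.

```python
# A minimal sketch of unity-based (min-max) normalization and four-class
# categorization of a DRASTIC-style vulnerability index. The raw DRASTICL
# index values below are hypothetical; thresholds follow the abstract.

def normalize(values):
    """Min-max normalize raw vulnerability indices to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def categorize(v):
    """Map a normalized index to one of the four vulnerability classes."""
    if v > 0.75:
        return "very high"
    if v > 0.5:
        return "high"
    if v > 0.25:
        return "low"
    return "very low"

raw = [112, 145, 170, 198, 131]   # hypothetical raw DRASTICL indices
classes = [categorize(v) for v in normalize(raw)]
```

In a real application the normalized index would be computed cell by cell over the study-area raster before classification.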
72.
Integrating migration velocity analysis and full-waveform inversion can help reduce the strong non-linearity of the classic full-waveform-inversion objective function. Inverting for the long- and short-wavelength components of the velocity model with a dual objective function sensitive to both components is still very expensive and has produced mixed results. We develop an approach in which the two components are integrated to complement each other. Specifically, we use the image to generate reflections in our synthetic data only when the velocity model is not capable of producing such reflections. As a result, migration velocity analysis operates when we need it, and its influence is mitigated when the velocity model produces accurate reflections (possibly first at the low frequencies). This is achieved using a novel objective function that includes both objectives. Applications to a layered model and the Marmousi model demonstrate the main features of the approach.
74.
Wavefield computations using the ellipsoidally anisotropic extrapolation operator offer a significant cost reduction compared with the orthorhombic case, especially when the symmetry planes are tilted and/or rotated. However, ellipsoidal anisotropy does not provide an accurate wavefield representation or imaging for media of orthorhombic symmetry. We therefore propose the use of 'effective ellipsoidally anisotropic' models that correctly capture the kinematic behaviour of wavefields in tilted orthorhombic (TOR) media. We compute effective velocities for the ellipsoidally anisotropic medium using a kinematic high-frequency representation of the TOR wavefield, obtained by solving the TOR eikonal equation. The effective model allows us to use the cheaper ellipsoidally anisotropic wave-extrapolation operators. Although the effective models are obtained by kinematic matching using high-frequency asymptotics, the resulting wavefield contains most of the critical wavefield components, including frequency dependency and caustics (if present), with reasonable accuracy. The proposed methodology offers a much better cost-versus-accuracy trade-off for wavefield computations in TOR media, particularly for media of low to moderate anisotropic strength. Furthermore, the computed wavefield solution is free of shear-wave artefacts, in contrast to the conventional finite-difference-based TOR wave-extrapolation scheme. We demonstrate the applicability and usefulness of our formulation through numerical tests on synthetic TOR models.
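As background to the ellipsoidal kinematics the abstract relies on, the phase velocity in an ellipsoidally anisotropic medium follows a simple relation: its square is a direction-weighted sum of the squared principal velocities. The function name and velocity values below are illustrative assumptions; the authors' effective-velocity fitting via the TOR eikonal equation is not reproduced here.

```python
import math

# Illustrative only: phase velocity for an ellipsoidally anisotropic medium,
# v^2(n) = v1^2 n1^2 + v2^2 n2^2 + v3^2 n3^2, with n a unit direction and
# v1, v2, v3 the principal velocities. Names and values are assumptions.

def ellipsoidal_phase_velocity(n, v1, v2, v3):
    """Phase velocity along the unit direction n = (n1, n2, n3)."""
    n1, n2, n3 = n
    return math.sqrt((v1 * n1) ** 2 + (v2 * n2) ** 2 + (v3 * n3) ** 2)
```

Along a principal axis the expression reduces to that axis velocity, which is what makes the operator so much cheaper than a full orthorhombic one.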
75.
Small-scale hyporheic zone (HZ) models often use a spatial periodic boundary (SPB) pair to simulate an infinite repetition of bedforms. SPBs are common features of commercially available multiphysics modeling packages; MODFLOW's lack of this boundary type has precluded it from being used effectively in this area of HZ research. We present a method to implement the SPB in MODFLOW by developing the appropriate block-centered finite-difference expressions. The implementation is analogous to MODFLOW's general-head boundary package, except that the terms on the right-hand side of the solution equations must be updated with each iteration. Consequently, models that implement the SPB converge best with solvers that perform both inner and outer iterations. The correct functioning of the SPB condition in MODFLOW is verified with two examples. This boundary condition allows users to build HZ-bedform models in MODFLOW, facilitating further research with related codes such as MT3DMS and PHT3D.
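The inner/outer-iteration idea can be illustrated with a toy block-centered finite-difference head solver (not MODFLOW code): the periodic left/right terms play the role of the general-head-style right-hand side and are refreshed only in the outer loop. Grid size, fixed heads, and iteration counts are illustrative assumptions.

```python
import numpy as np

# Toy 2-D steady-state head solver with a spatially periodic left/right
# boundary pair. The periodic terms (left, right) are frozen during the
# inner Jacobi sweeps and refreshed once per outer iteration, mimicking a
# general-head-style right-hand side that must be updated each iteration.

ny, nx = 20, 30
h = np.zeros((ny, nx))
h[0, :] = 1.0          # fixed head along the top boundary
h[-1, :] = 0.0         # fixed head along the bottom boundary

for outer in range(500):
    left = h[:, -1].copy()     # periodic pair: left face sees the right face
    right = h[:, 0].copy()     # and vice versa
    for inner in range(50):    # inner sweeps with the periodic terms frozen
        hn = h.copy()
        hn[1:-1, 1:-1] = 0.25 * (h[2:, 1:-1] + h[:-2, 1:-1] +
                                 h[1:-1, 2:] + h[1:-1, :-2])
        hn[1:-1, 0] = 0.25 * (h[2:, 0] + h[:-2, 0] + h[1:-1, 1] + left[1:-1])
        hn[1:-1, -1] = 0.25 * (h[2:, -1] + h[:-2, -1] + h[1:-1, -2] + right[1:-1])
        h = hn
```

With uniform fixed heads on top and bottom the converged solution is linear in the vertical direction and identical on the two periodic faces, which is a convenient check that the boundary pair is wired correctly.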
76.
Seismic hazard analysis requires knowledge of the recurrence rates of the large-magnitude earthquakes that drive the hazard at the low probabilities of interest for seismic design. Earthquake recurrence is usually determined through studies of the historic earthquake catalogue for a given region. Reliable historic catalogues generally span 100–200 years in North America, while large-magnitude events (M ≥ 7) have recurrence intervals on the order of hundreds or thousands of years in many areas, resulting in large uncertainty in the recurrence rates of large events. Using Monte Carlo techniques and assuming typical recurrence parameters, we simulate earthquake catalogues that span long periods of time. We then split these catalogues into smaller catalogues spanning 100–200 years that mimic the length of historic catalogues. For each of these simulated "historic" catalogues, a recurrence rate for large-magnitude events is determined. By comparing recurrence rates from one historic-length catalogue to another, we quantify the uncertainty associated with determining recurrence rates from short historic catalogues. The use of simulations to explore the uncertainty (rather than analytical solutions) gives us the flexibility to consider issues such as the relative contributions of aleatory versus epistemic uncertainty and the influence of the fitting method, as well as lending insight into extreme-event statistics. The uncertainty in the recurrence rates of large (M > 7) events is about a factor of two in regions of high seismicity, owing to the shortness of historic catalogues, and it increases greatly with decreasing seismic activity. The uncertainty depends on the length of the catalogue as well as on the fitting method used (least squares vs. maximum likelihood). Examination of 90th-percentile recurrence rates reveals that epistemic uncertainty in the true parameters may cause recurrence rates determined from historic catalogues to be uncertain by a factor greater than 50.
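A minimal sketch of the Monte Carlo idea, under assumed Gutenberg-Richter parameters: simulate many historic-length windows as Poisson counts of M ≥ 7 events and examine the spread of the inferred rates. The a- and b-values, the window length, and the reduction to Poisson counts (rather than full magnitude-by-magnitude catalogues fit by least squares or maximum likelihood) are simplifying assumptions.

```python
import math
import random

# Sketch: rate uncertainty from short "historic" catalogues. We assume a
# Gutenberg-Richter relation log10 N(>=M) = a - b*M with illustrative a, b,
# and treat each historic-length window as a Poisson count of M >= 7 events.

random.seed(0)
a, b = 4.5, 1.0
rate_m7_true = 10 ** (a - b * 7.0)   # true annual rate of M >= 7 events

window_years = 150                   # mimics a 100-200 year historic catalogue
n_windows = 2000
estimates = []
for _ in range(n_windows):
    lam = rate_m7_true * window_years    # expected events per window
    # Draw a Poisson variate by CDF inversion (stdlib random has no poisson)
    u = random.random()
    k, p = 0, math.exp(-lam)
    c = p
    while u > c:
        k += 1
        p *= lam / k
        c += p
    estimates.append(k / window_years)   # inferred rate from this window
```

With these parameters the expected count per window is below one event, so a large fraction of windows contain no M ≥ 7 earthquake at all, which is exactly the short-catalogue problem the abstract quantifies.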