  Subscription full text   397
  Free                     14
  Free (domestic)          2
Surveying and mapping   6
Atmospheric sciences    27
Geophysics              128
Geology                 113
Oceanography            25
Astronomy               85
Interdisciplinary       1
Physical geography      28
Results by year:
  2022: 5   2021: 4   2020: 10  2019: 7   2018: 11  2017: 7   2016: 10  2015: 11  2014: 7   2013: 14
  2012: 14  2011: 6   2010: 17  2009: 14  2008: 16  2007: 18  2006: 9   2005: 12  2004: 4   2003: 12
  2002: 6   2001: 6   2000: 8   1999: 5   1998: 4   1997: 6   1996: 3   1995: 8   1994: 12  1993: 9
  1992: 4   1991: 7   1990: 6   1989: 4   1987: 6   1986: 7   1985: 13  1984: 7   1983: 5   1982: 9
  1981: 9   1980: 3   1979: 8   1978: 6   1977: 8   1976: 6   1975: 4   1973: 6   1971: 5   1969: 4
Sorted by relevance: 413 results found (search time: 15 ms)
51.
This study focuses on a passive treatment system known as the horizontal reactive treatment well (HRX Well®), which is installed parallel to groundwater flow and operates on the principle of flow focusing that results from the hydraulic conductivity (K) contrast between the well and aquifer media. Passive flow and capture in the HRX Well are described by simplified equations adapted from Darcy's law. A field pilot-scale study (PSS) and numerical simulations using the finite element method (FEM) were conducted to verify the HRX Well concept and to test the validity of the simplified equations. The hydraulic performance results from both studies were in close agreement with the simplified equations, and the hydraulic capture width was approximately five times the well diameter (0.20 m). Key parameters affecting capture included the aquifer thickness, the well diameter, and the permeability ratio of the HRX Well treatment media to the aquifer material. During pilot testing, the HRX Well captured 39% of the flow while representing only 0.5% of the test pit cross-sectional volume, indicating that the well captures a substantial amount of the surrounding groundwater. While uncertainty in the aquifer and well properties (porosity, K, well losses), including the effects of boundary conditions, may have caused minor differences in the results, the data from this study indicate that the simplified equations are valid for the conceptual design of a field study. Based on the outcomes of this study, a full-scale HRX Well was installed at Site SS003 at Vandenberg Air Force Base, California, in July/August 2018.
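The flow-focusing principle can be sketched with Darcy's law and the classical two-dimensional result for a circular high-K inclusion in uniform flow, where the flux through the inclusion is amplified by F = 2Kw/(Kw + Ka). This is only an illustrative analog, not the HRX Well equations from the study (a long open well with three-dimensional convergence captures considerably more, consistent with the ~5x diameter capture width reported above); all parameter values are hypothetical.

```python
# Illustrative flow-focusing sketch (2-D circular-inclusion analog of Darcy flow).
# NOT the study's HRX Well equations; parameter values are assumed.

def focusing_factor(k_well, k_aquifer):
    """Flux amplification inside a circular high-K inclusion in uniform 2-D flow.
    Approaches 2 as k_well / k_aquifer grows."""
    return 2.0 * k_well / (k_well + k_aquifer)

def darcy_flux(k, gradient):
    """Darcy specific discharge q = K * i (m/s)."""
    return k * gradient

k_aq = 1e-4    # aquifer hydraulic conductivity, m/s (assumed)
k_well = 1e-1  # effective K of well treatment media, m/s (assumed)
i = 0.005      # regional hydraulic gradient (assumed)

q_ambient = darcy_flux(k_aq, i)
F = focusing_factor(k_well, k_aq)
print(f"ambient Darcy flux: {q_ambient:.2e} m/s")
print(f"2-D focusing factor: {F:.3f} (theoretical ceiling of 2 for Kw >> Ka)")
```

The 2-D factor saturating at 2 shows why the simple analog understates capture: the field capture width (~5x the well diameter) reflects flow converging toward the open well from the full aquifer thickness.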
52.
Using a subset of the SEG Advanced Modeling Program Phase I controlled-source electromagnetic data, we apply our standard controlled-source electromagnetic interpretation workflows to delineate a simulated hydrocarbon reservoir. The experience gained from characterizing such a complicated model offers an opportunity to refine our workflows and achieve better interpretation quality. The exercise proceeded in blind-test style: the interpreting geophysicists did not know the true resistivity model until the end of the project. Instead, the interpreters were provided a traditional controlled-source electromagnetic data package, including electric field measurements, interpreted seismic horizons, and well log data. A background resistivity model was first established from petrophysical analysis. The interpreters then carried out feasibility studies to establish the recoverability of the prospect and carefully stepped through 1D, 2.5D, and 3D inversions, with seismic and well log data integrated at each stage. A high-resistivity zone was identified with 1D analysis and further characterized with 2.5D inversions; its lateral distribution was confirmed with a 3D anisotropic inversion. The exercise demonstrates the importance of integrating all available geophysical and petrophysical data to derive a more accurate interpretation.
53.

Parameterization of wave runup is of paramount importance for the assessment of coastal hazards. Parametric models employ wave (e.g., Hs and Lp) and beach (i.e., β) parameters to estimate extreme runup (e.g., R2%). Recent studies have therefore been devoted to improving such parameterizations by including additional information on wave forcing or beach morphology. However, the effects of intra-wave dynamics, related to the random nature of the wave transformation process, on runup statistics have not been incorporated. This work employs a phase- and depth-resolving model, based on the Reynolds-averaged Navier-Stokes equations, to investigate different sources of variability associated with runup on planar beaches. The numerical model is validated with laboratory runup data. Subsequently, the roles of aleatory uncertainty and of other known sources of runup variability (i.e., frequency spreading and bed roughness) are investigated. Model results show that aleatory uncertainty can be more important than the contributions from other sources of variability such as bed roughness and frequency spreading. The ensemble results are used to develop a new parametric model based on the Hunt (J Waterw Port Coastal Ocean Eng 85:123–152, 1959) scaling parameter β(HsLp)^(1/2).
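The Hunt scaling parameter is simple to evaluate: with the linear-theory deep-water wavelength Lp = g·Tp²/(2π), it is β·(Hs·Lp)^(1/2), which has units of metres. A minimal sketch follows; the linear model `r2_estimate` and its coefficients `a`, `b` are placeholders, not the fitted values from the study.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def deepwater_wavelength(Tp):
    """Linear-theory deep-water wavelength Lp = g * Tp^2 / (2*pi), metres."""
    return G * Tp**2 / (2.0 * math.pi)

def hunt_parameter(beta, Hs, Lp):
    """Hunt (1959) scaling parameter beta * (Hs * Lp)**0.5, metres."""
    return beta * math.sqrt(Hs * Lp)

def r2_estimate(beta, Hs, Tp, a=0.0, b=1.0):
    """Placeholder linear model R2% = a + b * beta*sqrt(Hs*Lp).
    Coefficients a and b are assumed, NOT the study's fit."""
    return a + b * hunt_parameter(beta, Hs, deepwater_wavelength(Tp))

# Example: planar beach slope 0.05, Hs = 2 m, Tp = 10 s
Lp = deepwater_wavelength(10.0)
print(f"Lp = {Lp:.1f} m, Hunt parameter = {hunt_parameter(0.05, 2.0, Lp):.2f} m")
```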
54.
Following the Ixtoc I oil rig blowout in Campeche Bay, we hypothesized that the resulting tarballs should eventually appear in the Gulf Stream off Georgia, and that, because of dynamic barriers on the inner shelf, little tar would reach nearshore areas. To test these hypotheses, surface tows to collect floating tar were taken off the coasts of Georgia and Florida in October and December 1979. No tar was found within 40 km of the shore, whereas all samples taken more than 40 km offshore contained some tar. The mean concentration was 0.82 mg m⁻², with a range of 0.01–5.6 mg m⁻². Closely spaced sampling showed extreme variation, but the trends were consistent. Perylene was the most abundant compound in the tarballs.
55.
A number of processes may modify the noble gas composition of silicate liquids, so that the composition of noble gases observed in the glassy margins of deep-sea basalts is not that of the upper mantle. Differential solubility enhances the light noble gases relative to the heavier ones; however, we demonstrate that the observed abundance pattern cannot be attributed to the solubility of noble gases in atmospheric proportions. Partial melting and fractional crystallization increase the noble gas content of all species relative to mantle concentrations but do not fractionate their relative abundances. Noble gases may be lost from an ascending magma in various ways; the most important, however, may be the exclusion of gas from crystals forming at the time of solidification, which is shown to result in a marked loss of gas from the basalt. Small amounts of low-temperature alteration of solidified basalt can produce dramatic changes in the noble gas abundance pattern, since the adsorption coefficients for the different noble gases favor uptake of the heavy species relative to the light ones. Atmospheric contamination can account for the observed variations in the 40Ar/36Ar ratio of oceanic basalts. The degree of crystallinity of the glassy margins of deep-sea basalts may control the helium abundance of these samples; however, the uniform 3He/4He values reported apparently reflect a relatively constant proportion of radiogenic and primordial helium in the mantle.
56.
The objective of in situ thermal treatment is typically to reduce the contaminant mass or the average soil concentration below a specified value. Whether the objective has been met is usually evaluated by averaging soil concentrations from a limited number of soil samples. Results from several field sites indicate large performance uncertainty with this approach, even when the number of samples is large. We propose a method to estimate the average soil concentration by fitting a log-normal probability model to thermal mass recovery data. A statistical approach is presented for making termination decisions from mass recovery data, soil sample data, or both, for an entire treatment volume or for subregions, that explicitly considers estimation uncertainty; it is coupled to a stochastic optimization algorithm that identifies monitoring strategies meeting the objectives at minimum expected cost. Early termination of heating in regions that reach cleanup targets sooner reduces operating costs while ensuring a high likelihood of meeting the remediation objectives. Results for an example problem demonstrate that significant performance improvements and cost reductions can be achieved with this approach.
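The log-normal estimate of an average concentration can be sketched with simple log-moment fitting: the mean of a log-normal variable is exp(μ + σ²/2), where μ and σ² are the mean and variance of the log-transformed data. The sample values below are synthetic, and this sketch omits the study's full statistical decision and stochastic-optimization framework.

```python
import math
import statistics

# Sketch: log-normal estimate of average soil concentration from sample data.
# Synthetic values; simple log-moment estimators, not the study's framework.

def lognormal_mean_estimate(samples):
    """Fit mu, sigma^2 to log-data and return exp(mu + sigma^2 / 2)."""
    logs = [math.log(x) for x in samples]
    mu = statistics.fmean(logs)
    sigma2 = statistics.pvariance(logs)
    return math.exp(mu + sigma2 / 2.0)

conc = [0.5, 1.2, 0.8, 3.5, 0.3, 2.1, 0.9, 1.6]  # mg/kg, synthetic
arith = statistics.fmean(conc)
logn = lognormal_mean_estimate(conc)
print(f"arithmetic mean: {arith:.2f} mg/kg, log-normal mean estimate: {logn:.2f} mg/kg")
```

The log-normal estimate down-weights the influence of any single high outlier relative to the arithmetic mean, which is one motivation for model-based averaging when sample counts are small.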
57.
This paper presents a general framework for predicting the residual drift of idealized SDOF systems that can be used to represent non-degrading structures, including those with supplemental dampers. The framework first uses post-peak oscillation analysis to predict the maximum ratio of residual displacement to peak transient displacement in a random sample. Residual displacement ratios obtained from nonlinear time-history analyses using both far-field and near-fault pulse records were then examined to identify trends, which were explained using the oscillation mechanics of SDOF systems. It is shown that large errors can result from existing probability models that do not capture the influence of key parameters on the residual displacement. Building on these observations, a general probability distribution for the ratio of residual displacement to peak transient displacement is proposed that more accurately reflects the physical bounds obtained from post-peak oscillation analysis, capturing the probabilistic residual displacement response of these systems. The proposed distribution is shown to be more accurate than previously proposed distributions in the literature, owing to its explicit accounting for dynamic and damping properties, which have a significant impact on the residual displacement. This study provides a rational basis for the further development of a residual drift prediction tool for the performance-based design and analysis of more complex multi-degree-of-freedom systems.
58.
Probabilistic seismic risk assessment for spatially distributed lifelines is less straightforward than for individual structures. While procedures such as the 'PEER framework' have been developed for risk assessment of individual structures, they are not easily applicable to distributed lifeline systems: it is difficult to describe ground-motion intensity (e.g., spectral acceleration) over a region (in contrast to a single site, where intensity is readily quantified using Probabilistic Seismic Hazard Analysis), and the link between the ground-motion intensities and lifeline performance is usually not available in closed form. As a result, Monte Carlo simulation (MCS) and its variants are well suited for characterizing ground motions and computing the resulting losses to lifelines. This paper proposes a simulation-based framework for developing a small but stochastically representative catalog of earthquake ground-motion intensity maps that can be used for lifeline risk assessment. In this framework, importance sampling is used to preferentially sample 'important' ground-motion intensity maps, and K-means clustering is used to identify and combine redundant maps in order to obtain a small catalog. The effects of sampling and clustering are accounted for through a weight on each remaining map, so that the resulting catalog remains a probabilistically correct representation. The feasibility of the proposed framework is illustrated by using it to assess the seismic risk of a simplified model of the San Francisco Bay Area transportation network. A catalog of just 150 intensity maps is generated to represent the hazard at 1038 sites from 10 regional fault segments producing earthquakes with magnitudes between five and eight. The risk estimates obtained using these maps are consistent with those obtained using conventional MCS with many orders of magnitude more ground-motion intensity maps. The proposed technique can therefore drastically reduce the computational expense of a simulation-based risk assessment without compromising the accuracy of the risk estimates, facilitating computationally intensive risk analysis of systems such as transportation networks. Finally, the study shows that the uncertainties in the ground-motion intensities, and the spatial correlations between intensities at different sites, must be modeled in order to obtain unbiased estimates of lifeline risk. Copyright © 2010 John Wiley & Sons, Ltd.
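The catalog-reduction idea, clustering many simulated intensity maps and keeping one weighted representative per cluster, can be sketched in miniature. Here each "map" is reduced to a single scalar for brevity, the k-means is a toy 1-D implementation, and the importance-sampling stage of the paper is omitted; none of this reproduces the study's actual scheme. The key property illustrated is that count-proportional weights keep the reduced catalog's statistics consistent with the full sample.

```python
import random

# Toy catalog reduction: cluster synthetic scalar "intensity maps" and keep
# one weighted representative (the centroid) per cluster.

def kmeans_1d(values, k, iters=50, seed=0):
    """Minimal 1-D k-means; returns (centroids, clusters)."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        # Recompute each centroid as its cluster mean (keep old center if empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters

random.seed(1)
maps = [random.lognormvariate(0.0, 0.6) for _ in range(2000)]  # synthetic intensities
centers, clusters = kmeans_1d(maps, k=10)
weights = [len(c) / len(maps) for c in clusters]  # weight on each representative

full_mean = sum(maps) / len(maps)
catalog_mean = sum(w * c for w, c in zip(weights, centers))
print(f"full-sample mean {full_mean:.4f} vs 10-map weighted mean {catalog_mean:.4f}")
```

Because each centroid is its cluster's mean and weights are proportional to cluster sizes, the weighted catalog mean matches the full-sample mean exactly (to floating-point precision), which is the sense in which the small catalog remains "probabilistically correct" for this statistic.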
59.
Many seismic loss problems (such as disruption of distributed infrastructure and losses to portfolios of structures) depend on the regional distribution of ground-motion intensity rather than on the intensity at a single site. Quantifying ground motion over a spatially distributed region therefore requires information on the correlation between ground-motion intensities at different sites during a single event. The focus of the present study is the spatial correlation between ground-motion spectral accelerations at different periods. Ground motions from eight well-recorded earthquakes were used to study these correlations. On the basis of the empirical correlation estimates obtained, we propose a geostatistics-based method for formulating a predictive model suitable for simulating spectral accelerations at multiple sites and multiple periods, for crustal earthquakes in active seismic regions. While the calibration of this model and the investigation of its implications were somewhat complex, the model itself is very simple to use for making correlation predictions: a user only needs to evaluate a simple equation, relying on three sets of coefficients provided here, to compute the correlation coefficient for spectral values at two periods and a specified separation distance. These results may then be used in evaluating the seismic risk of portfolios of structures with differing fundamental periods. Copyright © 2012 John Wiley & Sons, Ltd.
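A model of this kind typically takes the form of a correlogram that decays with separation distance. The sketch below uses the common geostatistical exponential form rho(h) = exp(-3h/b), where the range b is the distance at which correlation falls to about 0.05; the range value is an assumed placeholder, not one of the coefficient sets from the study, and the cross-period terms are omitted.

```python
import math

# Sketch of a distance-decaying spatial-correlation model for spectral
# acceleration. Exponential correlogram; range value below is assumed.

def spatial_correlation(h_km, range_km):
    """Correlation between ground-motion residuals at separation h (km):
    rho(h) = exp(-3 h / b), so rho(b) = exp(-3) ~= 0.05."""
    return math.exp(-3.0 * h_km / range_km)

b = 25.0  # assumed correlation range for one illustrative period, km
for h in (0.0, 5.0, 10.0, 25.0):
    print(f"h = {h:5.1f} km -> rho = {spatial_correlation(h, b):.3f}")
```

In a regional simulation, such correlation coefficients populate the covariance matrix used to draw spatially correlated residuals on top of the median ground-motion prediction at each site.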
60.
The conditional spectrum (CS, with mean and variability) is a target response spectrum that links nonlinear dynamic analysis back to probabilistic seismic hazard analysis for ground motion selection. The CS is computed for a specified conditioning period, whereas the structures under consideration may be sensitive to spectral amplitudes at multiple periods of excitation. Questions therefore remain regarding the appropriate choice of conditioning period when the CS is used as the target spectrum. This paper focuses on risk-based assessments, which estimate the annual rate of exceeding a specified structural response amplitude. Seismic hazard analysis, ground motion selection, and nonlinear dynamic analysis are performed using conditional spectra with varying conditioning periods to assess the performance of a 20-story reinforced concrete frame structure. It is shown that risk-based assessments are relatively insensitive to the choice of conditioning period when the ground motions are carefully selected to ensure hazard consistency. This insensitivity arises because, when CS-based ground motion selection is used, the distributions of the response spectra of the selected ground motions are consistent with the site ground-motion hazard curves at all relevant periods, and this consistency is independent of the conditioning period. The importance of an exact CS (one that incorporates multiple causal earthquakes and ground motion prediction models) for achieving the appropriate spectral variability at periods away from the conditioning period is also highlighted. The findings of this paper are expected theoretically but had not been empirically demonstrated previously. Copyright © 2013 John Wiley & Sons, Ltd.
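The core CS calculation conditions the log spectral acceleration at each period T on the observed deviation (epsilon) at the conditioning period T*: the conditional log-mean is shifted by rho(T, T*)·epsilon·sigma, and the conditional log-standard deviation shrinks to sigma·sqrt(1 - rho²), vanishing at T* itself. A minimal sketch follows; the mu, sigma, rho, and epsilon values are illustrative placeholders, not output of a real ground motion prediction model or hazard deaggregation.

```python
import math

# Sketch of the conditional-spectrum calculation at a single period.
# All numeric inputs are illustrative placeholders.

def conditional_spectrum(mu_ln, sigma_ln, rho, eps_star):
    """Conditional log-mean and log-std of Sa(T), given epsilon at T*.
    mu_ln, sigma_ln: unconditional log-mean/log-std of Sa(T) from a GMPE;
    rho: correlation of epsilons between T and T*; eps_star: epsilon at T*."""
    mu_c = mu_ln + rho * eps_star * sigma_ln
    sigma_c = sigma_ln * math.sqrt(1.0 - rho**2)
    return mu_c, sigma_c

# At the conditioning period itself rho = 1, so the conditional std is zero:
mu_c, sig_c = conditional_spectrum(mu_ln=-1.0, sigma_ln=0.6, rho=1.0, eps_star=1.5)
print(f"conditional median Sa(T*) = {math.exp(mu_c):.3f} g, sigma = {sig_c:.3f}")

# At a distant period with weak correlation, the spectrum retains variability:
mu_d, sig_d = conditional_spectrum(mu_ln=-1.0, sigma_ln=0.6, rho=0.3, eps_star=1.5)
print(f"distant-period conditional sigma = {sig_d:.3f}")
```

The variability retained at periods away from T* (the second call) is exactly what the abstract stresses: an "exact" CS must preserve it so that the selected ground motions remain hazard-consistent at all relevant periods.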
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号