Fee-based full text: 398 articles
Open access: 29 articles
Domestic open access: 5 articles

By subject:
Surveying and mapping: 19
Atmospheric sciences: 40
Geophysics: 119
Geology: 147
Oceanography: 50
Astronomy: 26
Interdisciplinary: 2
Physical geography: 29

By year:
2023: 3
2022: 6
2021: 13
2020: 15
2019: 12
2018: 24
2017: 21
2016: 27
2015: 21
2014: 11
2013: 27
2012: 31
2011: 25
2010: 26
2009: 16
2008: 25
2007: 15
2006: 18
2005: 8
2004: 15
2003: 10
2002: 5
2001: 2
2000: 1
1999: 3
1998: 3
1997: 2
1996: 2
1995: 4
1994: 1
1993: 2
1992: 1
1990: 6
1989: 2
1987: 4
1985: 2
1984: 6
1983: 2
1982: 3
1981: 1
1979: 3
1977: 3
1975: 3
1973: 1
1971: 1

Sort order: 432 results in total, search time 15 ms
161.
Value of information analysis helps a decision maker evaluate the benefits of acquiring or processing additional data. Such analysis is particularly beneficial in the petroleum industry, where information gathering is costly and time-consuming, and where there are often abundant opportunities for creative information-gathering schemes involving the type and location of geophysical measurements. A consistent evaluation of such data requires spatial modeling that realistically captures the various aspects of the decision situation: the uncertain reservoir variables, the alternatives and the geophysical data under consideration. The computational tasks of value of information analysis can be daunting in such spatial decision situations; in this paper, a regression-based approximation approach is presented. The approach involves Monte Carlo simulation of data followed by linear regression to fit the conditional expectation expression needed for value of information analysis. Efficient approximations allow practical value of information analysis for the spatial decision situations typically encountered in petroleum reservoir evaluation. Applications are presented for seismic amplitude data and electromagnetic resistivity data; one example includes multi-phase fluid flow simulations.
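The regression-based approximation described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual workflow: it assumes a single uncertain reservoir value `x`, a binary develop/walk-away decision, and simulated data `y` that is simply `x` plus Gaussian noise; the conditional expectation E[x | y] is fitted by ordinary least squares over the Monte Carlo samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prior on net reservoir value (arbitrary monetary units)
n = 100_000
x = rng.normal(loc=-0.5, scale=2.0, size=n)      # uncertain payoff, prior mean < 0
y = x + rng.normal(scale=1.0, size=n)            # simulated noisy geophysical data

# Prior value: choose the best alternative (develop vs. walk away) without data
v_prior = max(np.mean(x), 0.0)

# Regression-based approximation of the conditional expectation E[x | y]:
# fit a linear model x ~ a + b*y by least squares on the Monte Carlo samples.
A = np.column_stack([np.ones(n), y])
coef, *_ = np.linalg.lstsq(A, x, rcond=None)
cond_mean = A @ coef                             # fitted E[x | y_i] for each sample

# Posterior value: decide optimally for each simulated data outcome, then average
v_posterior = np.mean(np.maximum(cond_mean, 0.0))

voi = v_posterior - v_prior
print(f"VOI estimate: {voi:.3f}")
```

The value of information is the gain from being able to condition the decision on the data; with these assumed parameters it is strictly positive, since the data can flip an otherwise unattractive development decision.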
162.
163.
Research on seismic safety assessment has attracted great interest in the scientific community in recent years. Although the devastating impact of earthquakes on society should be incentive enough, the development of more realistic mechanical behaviour models and the continuous enhancement of computational capabilities have contributed greatly to this interest. In this context, three research areas can be identified as currently leading to important developments: code-related research, especially in Europe, where new design codes are in the implementation process; risk analysis, namely the definition of methodologies for safety assessment that involve evaluating the failure probability and could be included in future generations of codes; and experimental characterization of constitutive laws, which supports the development and calibration of accurate and realistic numerical models for seismic analysis and the adequate characterization of limit-state capacities. The paper presents some current research trends on these three seismic safety assessment topics. The focus is on studies addressing the seismic safety assessment of structures, either probabilistically or based on code provisions, that consider realistic nonlinear mechanical behaviour models. Reference is also made to experimental research on the seismic behaviour of structural elements, emphasizing its crucial role in supporting the development of numerical models that simulate the effects of different seismic strengthening techniques. Finally, given the development of studies leading to new trends and perspectives for performance-based earthquake engineering, a possible scenario for seismic design in the future is presented, emphasizing the key issues for its implementation.
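Where the abstract mentions risk analysis via evaluation of the failure probability, a minimal Monte Carlo sketch is possible. The lognormal demand and capacity models and all numerical values below are illustrative assumptions, not taken from the paper; the closed-form reliability index for two independent lognormals serves as a cross-check.

```python
import math
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical lognormal models for seismic demand D and structural capacity C,
# each given by (median, logarithmic standard deviation) -- illustrative values.
n = 1_000_000
demand   = rng.lognormal(mean=math.log(0.8), sigma=0.5, size=n)  # e.g. drift demand
capacity = rng.lognormal(mean=math.log(2.0), sigma=0.3, size=n)  # e.g. drift capacity

# Failure probability: chance that demand exceeds capacity
pf_mc = np.mean(demand > capacity)

# Closed form for two independent lognormals, as a cross-check:
# beta = ln(median_C / median_D) / sqrt(sigma_D^2 + sigma_C^2), Pf = Phi(-beta)
beta = (math.log(2.0) - math.log(0.8)) / math.hypot(0.5, 0.3)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))
print(f"Pf (Monte Carlo) = {pf_mc:.4f}, Pf (closed form) = {pf_exact:.4f}")
```

The agreement between the sampled and analytical failure probabilities is the kind of consistency check a code-oriented reliability methodology would rely on before moving to nonlinear structural models, where no closed form exists.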
164.
Calcium oxalate-rich rock coatings are ubiquitous on limestone inside dry rock shelters and under bluff overhangs along canyon walls in southwestern Texas. Prehistoric pictographs occur in more than 250 such sites, and the ancient paints are encapsulated within the natural rock coating. Previous studies suggested lichens were the source of the oxalate; however, we report here that microbes cultured and isolated from samples of the coating produce oxalate in vitro. Twenty different bacterial species have been identified in samples from eight different sites, with Bacillus the most common genus, represented by five species. HPLC analyses of inoculated R2B medium after eight months of bacterial growth revealed the presence of oxalate ions in the solid phase of the growth medium. © 2008 Wiley Periodicals, Inc.
165.
Evaluating human disturbance in karst areas is difficult because of the complexity of these peculiar and unique environments. The human impact on karst geo-ecosystems is increasingly important, and there is a growing need for multidisciplinary tools to assess environmental change in karst areas. Many disciplines, such as biology, geomorphology, hydrology and socio-economic sciences, must be considered to adequately evaluate the impact on these intrinsically vulnerable areas. This article gives an overview of the evolution of environmental impact on karst areas of the island of Sardinia (Italy). In this case, the most important impacts over the past 50 years derive from the following activities, in decreasing order of importance: (1) mining and quarrying; (2) deforestation, agriculture and grazing; (3) building (widespread urbanisation, isolated homes, etc.) and related infrastructure (roads, sewer systems, aqueducts, waste dumps, etc.); (4) tourism; (5) military activities. To evaluate the present environmental state of these areas, the Disturbance Index for Karst environments [Van Beynen and Townsend (Environ Manage 36:101-116)] is applied in a slightly modified version. Instead of considering the indicators of environmental disturbance used in the original method, the modified index evaluates the disturbances causing the deterioration of the environmental attributes. In the Sardinian case study, 27 disturbances were evaluated, yielding a Disturbance Index ranging between 0 (pristine) and 1 (highly disturbed). This index simplifies the original KDI method, appears to adequately measure disturbance in Mediterranean karst areas and could be applied successfully to other similar regions.
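A minimal sketch of a KDI-style index computation is given below. The 0-3 ordinal scoring scale and the example scores for 27 disturbances are assumptions modelled loosely on the Van Beynen and Townsend approach, not values from the Sardinian study; the normalization simply maps the mean score onto the 0 (pristine) to 1 (highly disturbed) range.

```python
def disturbance_index(scores, max_score=3):
    """Normalise a set of disturbance scores to a 0-1 index.

    Each disturbance is rated on an ordinal scale (here 0 = absent to
    3 = severe, an assumed scale), and the index is the total score
    divided by the maximum attainable total, so 0 corresponds to
    pristine conditions and 1 to a highly disturbed area.
    """
    if not scores:
        raise ValueError("at least one disturbance score is required")
    for s in scores:
        if not 0 <= s <= max_score:
            raise ValueError(f"score {s} outside 0..{max_score}")
    return sum(scores) / (max_score * len(scores))


# Illustrative (hypothetical) scores for 27 evaluated disturbances
example_scores = [3, 3, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1,
                  1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(round(disturbance_index(example_scores), 3))
```

Because the result is a single normalized number, disturbance maps for different karst regions scored on the same scale become directly comparable, which is the point of applying the index beyond Sardinia.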
166.
This paper shows the potential of gravity data to map a buried landfill's bottom topography. To this end, a gravity inversion method is presented for estimating the landfill's bottom depths at discrete points, assuming a density contrast that decreases with depth according to a hyperbolic law. The method's efficiency was tested on synthetic data from simulated waste landfills, producing estimated bottom topographies very close to the true ones. The method was further evaluated by applying it to gravity data from the abandoned Thomas Farm Landfill site, Indiana, USA, whose bottom topography is known. The estimated topography showed close agreement with the known bottom topography of the Thomas Farm Landfill.
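Under a hyperbolic density-contrast law drho(z) = drho0 * beta^2 / (beta + z)^2, a Bouguer-slab forward model integrates to g = 2*pi*G*drho0*beta*h/(beta + h), which can be inverted analytically for the bottom depth h at each station. This per-station slab calculation is a strong simplification of the paper's inversion, which estimates the depths at discrete points jointly; the contrast drho0 = -500 kg/m^3 and decay factor beta = 10 m are assumed illustrative values, not the study's parameters.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2


def slab_anomaly(h, drho0, beta):
    """Bouguer-slab gravity anomaly (m/s^2) of a layer of thickness h whose
    density contrast decays hyperbolically: drho(z) = drho0*beta**2/(beta+z)**2.
    Integrating over 0..h gives g = 2*pi*G*drho0*beta*h/(beta + h)."""
    return 2 * math.pi * G * drho0 * beta * h / (beta + h)


def invert_depth(g, drho0, beta):
    """Invert the slab formula analytically for the bottom depth h."""
    C = 2 * math.pi * G * drho0 * beta   # asymptotic anomaly as h -> infinity
    if g * C < 0 or abs(g) >= abs(C):
        raise ValueError("anomaly inconsistent with the assumed density contrast")
    return g * beta / (C - g)


# Round trip for a hypothetical landfill: density deficit of 500 kg/m^3 at the
# surface, hyperbolic decay factor beta = 10 m
drho0, beta = -500.0, 10.0
for true_h in (2.0, 5.0, 12.0):
    g = slab_anomaly(true_h, drho0, beta)
    print(f"true depth {true_h:5.1f} m -> recovered {invert_depth(g, drho0, beta):.6f} m")
```

The guard on the asymptotic value `C` reflects a real property of the hyperbolic law: the anomaly of an ever-deeper slab saturates, so observed anomalies beyond that bound cannot be explained by the assumed contrast.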
167.
168.
The first part summarizes the main conclusions stemming from the different comparisons reported in this book. One main conclusion is that, with some care in the computations, one can obtain frequency differences between different computations that are smaller than required by the CoRoT challenges. This is true for high-frequency modes; small frequency differences are much harder to obtain for low frequencies around the fundamental radial mode frequency. Care here means using the same physics, the same physical constants and the same input stellar parameters, and computing models and oscillations with sufficient, self-consistent numerical accuracy. The ESTA group has built reference grids of models and associated oscillation frequencies and made them available to the community. In the second part of the present paper, a case study is considered to show the need for and use of such reference grids. Finally, some perspectives concerning the remaining tasks are suggested.
169.
Spectral filtering was compared with traditional mean spatial filters to assess their ability to identify and remove striped artefacts in digital elevation data. The techniques were applied to two datasets: a 100 m contour-derived digital elevation model (DEM) of southern Norway and a 2 m LiDAR DSM of the Lake District, UK. Both datasets contained diagonal data artefacts that were found to propagate into subsequent terrain analysis. Spectral filtering used fast Fourier transform (FFT) frequency data to identify these artefacts in both datasets; they were removed by applying a cut filter prior to the inverse transform. Spectral filtering showed considerable advantages over mean spatial filters when both the magnitude and the spatial distribution of the elevation changes were examined. Elevation changes from spectral filtering were restricted to the frequencies removed by the cut filter and were small in magnitude, thus avoiding any global smoothing. Spectral filtering was found to avoid the smoothing of kernel-based data editing and provided a more informative measure of the data artefacts present in the FFT frequency domain. Artefacts were found to be heterogeneous across the surfaces, a result of their strong correlations with spatially autocorrelated variables: landcover and landsurface geometry. Spectral filtering performed better on the 100 m DEM, where signal and artefact were clearly distinguishable in the frequency data. Spectrally filtered digital elevation datasets were found to provide a superior and more precise representation of the landsurface and to be a more appropriate dataset for subsequent geomorphological applications. Copyright © 2007 John Wiley & Sons, Ltd.
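The cut-filter workflow described in this abstract can be sketched on a synthetic surface: forward FFT, zeroing of the isolated stripe peaks in the frequency domain, then the inverse transform. The synthetic DEM, the diagonal-stripe model and the peak-detection threshold below are all assumptions for illustration; on a real DEM the stripe frequencies would be identified by inspecting the spectrum, as in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 128x128 "DEM": a smooth surface plus a diagonal striping artefact
n = 128
yy, xx = np.mgrid[0:n, 0:n]
surface = 50 * np.sin(2 * np.pi * xx / n) * np.cos(2 * np.pi * yy / n)
stripes = 8 * np.sin(2 * np.pi * 16 * (xx + yy) / n)   # diagonal stripes
dem = surface + stripes + rng.normal(scale=0.1, size=(n, n))

# Forward FFT: the stripes appear as an isolated conjugate pair of sharp peaks
F = np.fft.fft2(dem)
mag = np.abs(F)

# Protect the lowest wavenumbers (the terrain signal, including DC) from cutting
protect = np.zeros_like(mag, dtype=bool)
protect[:3, :3] = True
protect[-2:, :3] = True
protect[:3, -2:] = True
protect[-2:, -2:] = True

# Cut filter: zero any unprotected peak well above the background level
cut = (mag > 0.2 * mag.max()) & ~protect
F[cut] = 0.0

# Inverse transform recovers the surface with the stripes removed
filtered = np.fft.ifft2(F).real
print(f"max error before filtering: {np.abs(dem - surface).max():.2f}")
print(f"max error after filtering:  {np.abs(filtered - surface).max():.2f}")
```

Because only the cut frequencies change, elevation edits are confined to the stripe signal itself, which is exactly the advantage over kernel-based smoothing that the study reports.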
170.
Marine snow from upper and mid-water (i.e., pelagic) depths on the California margin is texturally and compositionally different from that traveling in the nepheloid layer. Transmission electron microscopy shows that pelagic marine snow consists primarily of bioclasts (e.g., diatom frustules, foram tests), organic matter, and microbes. These components are entrained as discrete particles or small aggregates (μm-scale in diameter) in a loose network of exocellular, muco-polysaccharide material. Clays are infrequent but, when present, are constituents of comparatively compact organic-rich microaggregates. Microbes are abundant and appear to decrease in number with increasing water depth. In contrast, marine snow aggregates collected from just above the sea floor in the nepheloid layer are assemblages of clay particles, clay flocs, and relatively dense clay-organic-rich microaggregates in an exocellular organic matrix. Bioclasts and microorganisms occur only rarely. The prevalence of clay-organic-rich aggregates in the nepheloid layer suggests that, prior to final deposition and burial, marine snow from the pelagic zone is subject to disaggregation and recombination with terrigenous detrital material near or at the sea floor. These results have significant implications for the accumulation and burial rates of organic carbon on continental margins and for the aging and bioavailability of sedimentary organic matter. The samples examined were collected offshore of northern and central California.
Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号