A total of 693 results were found; items 11–20 are shown below.
11.
We studied the temporal behavior of the background shallow seismicity rate in 700 circular areas across inland Japan. To search for and test the significance of possible rate changes in background seismicity, we developed an efficient computational method that applies the space–time ETAS model proposed by Ogata in 1998 to these areas, and we conducted Monte Carlo tests using a simulated catalog to validate the model. Our first finding was that activation anomalies occurred so frequently that the constant-background-seismicity hypothesis may not be appropriate and/or the triggered-event model with constraints on the parameters may not adequately describe the observed seismicity, whereas quiescence occasionally occurs merely by chance. Another outcome was that we could automatically detect several anomalous background seismicity rate changes associated with the occurrence of large earthquakes. Highly significant seismic activation was found before the M6.1 Mt. Iwate earthquake of 1998, and possible seismic quiescence was found in an area 150 km southwest of the focal region of the M7.3 Western Tottori earthquake of 2000; the seismicity rate in that area recovered after the mainshock.
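The space–time ETAS conditional intensity underlying the study can be sketched in a few lines. The kernel forms and every parameter value below (`mu`, `K`, `alpha`, `c`, `p`, `d`, `q`, `M0`) are illustrative assumptions, not the fitted model from the paper:

```python
import math

def etas_intensity(t, x, y, events, mu, K, alpha, c, p, d, q, M0):
    """Conditional intensity of a simplified space-time ETAS model:
    a constant background rate mu plus a triggered-event sum with
    Omori-Utsu time decay and a power-law spatial kernel."""
    lam = mu
    for ti, xi, yi, mi in events:
        if ti >= t:          # only past events trigger aftershocks
            continue
        r2 = (x - xi) ** 2 + (y - yi) ** 2
        productivity = K * math.exp(alpha * (mi - M0))  # magnitude scaling
        time_decay = (t - ti + c) ** (-p)               # Omori-Utsu law
        spatial = (r2 + d) ** (-q)                      # spatial kernel
        lam += productivity * time_decay * spatial
    return lam
```

The background term `mu` is what the study tests for rate changes; in the real model it varies in space and is estimated jointly with the triggering parameters.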
12.
From July 1996 to August 1997 the TOR project operated 130 seismographs in North Germany, Denmark and South Sweden, with the aim of collecting signals from local, regional and teleseismic earthquakes. This data set is particularly interesting because the seismic antenna crosses the most significant geological boundary in Europe, the Tornquist Zone, which in the northern part forms the border between the Baltic Shield and the younger European lithosphere. Previous studies have shown significant physical changes in the crust and upper mantle across this transition zone, including two independent teleseismic tomographic studies of the TOR data set; however, these two studies disagree on the orientation of the slope of the transition. Both used an iterative linearized inversion method. In this work we present an inversion based on Bayesian statistics, in which the solution space is explored in order to study a very large number of tomographic solutions and to examine their uniqueness and uncertainty. The method is applied to measurements of 3345 relative teleseismic P-phase travel times from 48 teleseismic earthquakes with good azimuthal coverage with respect to the great circle arc of the TOR array. We find the lithospheric transition to dip to the northeast at around 30° to 45° from vertical.
13.
Gamma ray logging is a method routinely employed by geophysicists and environmental engineers in site geology evaluations. Modelling of gamma ray data from individual boreholes assists in the local identification of major lithological changes; modelling these data from a network of boreholes assists with lithological mapping and spatial stratigraphic correlation. In this paper we employ Bayesian spatial partition models to analyse gamma ray data spatially. In particular, a spatial partition is defined via a Voronoi tessellation and the mean intensity is assumed constant in each cell of the partition. The number of vertices generating the tessellation as well as the locations of vertices are assumed unknown, and uncertainty about these quantities is described via a hierarchical prior distribution. We describe the advantages of the spatial partition modelling approach in the context of smoothing gamma ray count data and describe an implementation that may be extended to the fitting of a more general model than a constant mean within each cell of the partition. As an illustration of the methodology we consider a data set collected from a network of eight boreholes, which is part of a geophysical study to assist in mapping the lithology of a site. Gamma ray logs are linked with geological information from cores and the spatial analysis of log data assists with predicting the lithology at unsampled locations.
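The core idea of the spatial partition model — a Voronoi tessellation with a constant mean in each cell — can be sketched as follows. Here the vertex locations and cell means are fixed hypothetical inputs; in the paper they are unknowns with a hierarchical prior, explored by MCMC:

```python
import math

def cell_index(pt, vertices):
    """Voronoi cell membership: index of the nearest generating vertex."""
    return min(range(len(vertices)),
               key=lambda i: math.dist(pt, vertices[i]))

def partition_mean(pt, vertices, cell_means):
    """Piecewise-constant mean surface: each Voronoi cell carries one mean,
    so the fitted intensity at any point is its cell's mean."""
    return cell_means[cell_index(pt, vertices)]
```

Smoothing then amounts to averaging such surfaces over the posterior on the number and positions of vertices, which is what makes the tessellation adaptive to abrupt lithological changes.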
14.
Spatial probabilistic modeling of slope failure using a combined Geographic Information System (GIS), infinite-slope stability model and Monte Carlo simulation approach is proposed and applied in the landslide-prone area of Sasebo city, southern Japan. A digital elevation model (DEM) for the study area has been created at a scale of 1/2500, and calculated results for slope angle and slope aspect derived from the DEM are discussed. Through spatial interpolation of the identified stream network, the thickness distribution of the colluvium above the Tertiary strata is determined with precision. Finally, by integrating the infinite-slope stability model and Monte Carlo simulation with GIS, and applying spatial processing, a slope failure probability distribution map is obtained for both low and high water levels.
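A minimal sketch of the infinite-slope stability model combined with Monte Carlo simulation. The parameter distributions (cohesion, friction angle) and all numeric values are illustrative assumptions, not site data from Sasebo:

```python
import math
import random

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """Infinite-slope factor of safety. m is the saturated fraction of
    the colluvium thickness z (0 = dry, 1 = fully saturated); gamma and
    gamma_w are the soil and water unit weights (kN/m^3)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resist = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    drive = gamma * z * math.sin(beta) * math.cos(beta)
    return resist / drive

def failure_probability(n, m, seed=0):
    """Monte Carlo estimate of P(FS < 1) for one DEM cell, sampling
    uncertain soil parameters from assumed normal distributions."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        c = rng.gauss(8.0, 2.0)       # cohesion, kPa (hypothetical)
        phi = rng.gauss(30.0, 3.0)    # friction angle, deg (hypothetical)
        fs = factor_of_safety(c, phi, gamma=18.0, z=2.0,
                              beta_deg=35.0, m=m)
        fails += fs < 1.0
    return fails / n
```

Run per DEM cell with the local slope angle and interpolated colluvium thickness, this yields the kind of failure probability map the paper describes, for dry (`m=0`) and saturated (`m=1`) cases.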
15.
Smoothing and Change Point Detection for Gamma Ray Count Data   (total citations: 1; self-citations: 0; citations by others: 1)
Gamma ray detectors are used to measure the natural radioactivity of rocks. For a number of boreholes drilled at a site, the gamma ray detector is lowered into each borehole and counts of gamma ray emissions at different depths are recorded as the instrument is gradually raised to ground level. The profile of gamma counts can be informative about the geology at each location. The raw count data are highly variable, and in this paper we describe the use of adaptive smoothing techniques and change point models in order to identify changes in the geology based on the gamma logs. We formulate all our models for the data in the framework of the class of generalized linear models, and describe computational methods for Bayesian inference and model selection for generalized linear models that improve on existing techniques. Application is made to gamma ray data from the Castlereagh Waste Management Centre, which served as a hazardous waste disposal facility for the Sydney region between March 1974 and August 1998. Understanding the geological structure of this site is important for further modelling the transport of pollutants beneath the waste disposal area.
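As a much simpler stand-in for the paper's Bayesian GLM machinery, a single change point in Poisson-distributed count data can be located by profile likelihood: try every split of the depth sequence and keep the one where two separate Poisson means fit best.

```python
import math

def poisson_loglik(counts, lam):
    """Poisson log-likelihood up to the constant sum of log(c!)."""
    if lam <= 0:
        lam = 1e-9  # guard against all-zero segments
    return sum(c * math.log(lam) - lam for c in counts)

def best_change_point(counts):
    """Profile-likelihood estimate of a single change point: the split
    index k maximizing the two-segment Poisson log-likelihood."""
    best_k, best_ll = None, -math.inf
    for k in range(1, len(counts)):
        left, right = counts[:k], counts[k:]
        ll = (poisson_loglik(left, sum(left) / len(left)) +
              poisson_loglik(right, sum(right) / len(right)))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k
```

The paper's approach generalizes this in several directions at once: multiple change points, overdispersion via the GLM family, and posterior model selection rather than a single maximum-likelihood split.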
16.
As part of a wider study of the nature and origins of cation order–disorder in micas, a variety of computational techniques have been used to investigate the nature of tetrahedral and octahedral ordering in phengite, K2[6](Al3Mg)[4](Si7Al)O20(OH)4. Values of the atomic exchange interaction parameters Jn used to model the energies of order–disorder were calculated. Both tetrahedral Al–Si and octahedral Al–Mg ordering were studied, and hence three types of interaction parameter were necessary: for T–T, O–O and T–O interactions (where T denotes tetrahedral sites and O denotes octahedral sites). Values for the T–T and O–O interactions were taken from results on other systems, whilst we calculated new values for the T–O interactions. We have demonstrated that modelling the octahedral and tetrahedral sheets alone and independently produces different results from modelling a whole T–O–T layer, hence justifying the inclusion of the T–O interactions. Simulations of a whole T–O–T layer of phengite indicated the presence of short-range order, but no long-range order was observed.
Received: 8 August 2002 / Accepted: 14 February 2003
Acknowledgements: The authors are grateful to EPSRC (EJP) and the Royal Society (CIS) for financial support. Monte Carlo simulations were performed on the Mineral Physics Group's Beowulf cluster and the University of Cambridge's High Performance Computing Facility.
17.
The identifiability of model parameters of a steady state water quality model of the Biebrza River and the resulting variation in model results was examined by applying the Monte Carlo method which combines calibration, identifiability analysis, uncertainty analysis, and sensitivity analysis. The water quality model simulates the steady state concentration profiles of chloride, phosphate, ammonium, and nitrate as a function of distance along a river. The water quality model with the best combination of parameter values simulates the observed concentrations very well. However, the range of possible modelled concentrations obtained for other more or less equally eligible combinations of parameter values is rather wide. This range in model outcomes reflects possible errors in the model parameters. Discrepancies between the range in model outcomes and the validation data set are only caused by errors in model structure, or (measurement) errors in boundary conditions or input variables. In this sense the validation procedure is a test of model capability, where the effects of calibration errors are filtered out. It is concluded that, despite some slight deviations between model outcome and observations, the model is successful in simulating the spatial pattern of nutrient concentrations in the Biebrza River.
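The Monte Carlo calibration-and-identifiability loop can be sketched generically as a GLUE-style screening, assuming a user-supplied `simulate` function and uniform prior ranges. This is an illustration of the general technique, not the paper's exact procedure:

```python
import random

def mc_calibration(simulate, observed, priors, n=1000, seed=0):
    """GLUE-style Monte Carlo screening: sample parameter sets from
    uniform prior ranges, score each by sum-of-squares error against
    the observations, and return the near-best ("behavioural") sets.
    The spread among returned sets indicates parameter identifiability."""
    rng = random.Random(seed)
    scored = []
    for _ in range(n):
        theta = {k: rng.uniform(lo, hi) for k, (lo, hi) in priors.items()}
        sim = simulate(theta)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        scored.append((sse, theta))
    scored.sort(key=lambda t: t[0])
    return scored[: max(1, n // 20)]  # keep the top 5%
```

A wide range of nearly equally scoring parameter sets, as found in the paper, shows up here as large spread in the retained `theta` values despite similar `sse`.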
18.
A nonparametric resampling technique for generating daily weather variables at a site is presented. The method samples the original data with replacement while smoothing the empirical conditional distribution function. The technique can be thought of as a smoothed conditional bootstrap and is equivalent to simulation from a kernel density estimate of the multivariate conditional probability density function. This improves on the classical bootstrap technique by generating values that have not occurred exactly in the original sample and by alleviating the reproduction of fine spurious details in the data. Precipitation is generated from the nonparametric wet/dry spell model described in Lall et al. [1995]. A vector of other variables (solar radiation, maximum temperature, minimum temperature, average dew point temperature, and average wind speed) is then simulated by conditioning on the values of these variables on the preceding day and the precipitation amount on the day of interest. An application of the resampling scheme to 30 years of daily weather data at Salt Lake City, Utah, USA, is provided.
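A minimal sketch of the smoothed (kernel) bootstrap idea: resample with replacement, then perturb each draw with kernel noise, optionally conditioning on the preceding day via nearest neighbours. The univariate form and the fixed bandwidth are simplifications of the multivariate scheme in the paper:

```python
import random

def smoothed_bootstrap(data, n, bandwidth, seed=0):
    """Smoothed bootstrap: resample observed values with replacement,
    then add Gaussian kernel noise so simulated values need not
    coincide exactly with historical ones."""
    rng = random.Random(seed)
    return [rng.choice(data) + rng.gauss(0.0, bandwidth) for _ in range(n)]

def conditional_draw(prev, history, k, bandwidth, rng):
    """Conditional version: find the k historical days most similar to
    today's value `prev`, resample one of their successors, and smooth
    it with kernel noise (a one-variable stand-in for the paper's
    multivariate conditioning)."""
    idx = sorted(range(len(history) - 1),
                 key=lambda i: abs(history[i] - prev))[:k]
    j = rng.choice(idx)
    return history[j + 1] + rng.gauss(0.0, bandwidth)
```

Iterating `conditional_draw` day by day produces a synthetic weather sequence whose day-to-day dependence mimics the historical record without repeating it verbatim.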
19.
River flooding is a problem of international interest. In the past few years many countries suffered from severe floods. A large part of the Netherlands is below sea level and river levels. The Dutch flood defences along the river Rhine are designed for water levels with a probability of exceedance of 1/1250 per year. These water levels are computed with a hydrodynamic model using a deterministic bed level and a deterministic design discharge. Traditionally, the safety against flooding in the Netherlands is obtained by building and reinforcing dikes. Recently, a new policy was proposed to cope with increasing design discharges in the Rhine and Meuse rivers. This policy is known as the Room for the River (RfR) policy, in which a reduction of flood levels is achieved by measures creating space for the river, such as dike replacement, side channels and floodplain lowering. As compared with dike reinforcement, these measures may have a stronger impact on flow and sediment transport fields, probably leading to stronger morphological effects. As a result of the latter the flood conveyance capacity may decrease over time. An a priori judgement of safety against flooding on the basis of an increased conveyance capacity of the river can be quite misleading. Therefore, the determination of design water levels using a fixed-bed hydrodynamic model may not be justified and the use of a mobile-bed approach may be more appropriate. This problem is addressed in this paper, using a case study of the river Waal (one of the Rhine branches in the Netherlands). The morphological response of the river Waal to a flood protection measure (floodplain lowering in combination with summer levee removal) is analysed. The effect of this measure is subject to various sources of uncertainty. Monte Carlo simulations are applied to calculate the impact of uncertainties in the river discharge on the bed levels. 
The impact of the “uncertain” morphological response on design flood level predictions is analysed for three phenomena, viz. the spatial morphological variation over the years, the seasonal morphological variation, and the morphological variability around bifurcation points. The impact of seasonal morphological variations turns out to be negligible, but the other two phenomena each appear to have an appreciable impact (of order 0.05–0.1 m) on the computed design water levels. We note, however, that other sources of uncertainty that may be of influence (e.g. uncertainty in the hydraulic roughness predictor) are not taken into consideration; the present investigation is limited to the sensitivity of the design water levels to uncertainties in the predicted bed level.
20.
The floating production storage and offloading unit (FPSO) is an offshore vessel that produces and stores crude oil prior to tanker transport. Robust prediction of extreme hawser tensions during the FPSO offloading operation is an important safety concern, since excessive hawser tension may occur during certain sea conditions, posing an operational risk. In this paper, the finite element method (FEM) software ANSYS AQWA has been employed to analyze vessel response due to hydrodynamic wave loads acting on a specific FPSO vessel under actual sea conditions. In some practical situations, it is useful to improve the accuracy of statistical predictions based on one stochastic process, given another synchronous, highly correlated stochastic process that has been measured for a longer time than the process of interest. This paper addresses the issue of improving extreme value prediction in this setting; in other words, an efficient transfer of information between two synchronous, highly correlated stochastic processes is needed. Two such highly correlated FPSO hawser tension processes were simulated in order to test the efficiency of the proposed technique.
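The information-transfer idea — using a longer, highly correlated record to sharpen predictions for a shorter one — can be illustrated with a simple linear-regression mapping. This is a deliberately simplified stand-in for the paper's actual extreme-value technique:

```python
def transfer_extremes(short, long_sync, long_full):
    """Regress the short record on its synchronous stretch of the long
    record (ordinary least squares), then map the long record's full
    history through the fit to extend the short record's effective
    sample for extreme-value analysis."""
    n = len(short)
    mx = sum(long_sync) / n
    my = sum(short) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(long_sync, short))
    sxx = sum((x - mx) ** 2 for x in long_sync)
    b = sxy / sxx           # regression slope
    a = my - b * mx         # regression intercept
    return [a + b * x for x in long_full]
```

With the extended pseudo-record in hand, a standard extreme-value fit (e.g. block maxima) can be applied to many more samples than the short record alone provides; the paper's method handles this transfer for the tails far more carefully than a global linear fit does.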