891.
We propose a stochastic methodology for assessing the risk of a large earthquake when a long time has elapsed since the last large seismic event. We state an approximate probability distribution for the occurrence time of the next large earthquake, given that the last large seismic event occurred a long time ago. We prove that, under reasonable conditions, this distribution is exponential with a rate that depends on the asymptotic slope of the cumulative intensity function of a nonhomogeneous Poisson process. Because an empirical cumulative distribution function of the waiting time for the next large earthquake cannot be obtained, we derive an estimator of this cumulative distribution function from existing data. We conduct a simulation study to identify scenarios in which the proposed methodology performs well. Finally, a real-world data analysis, including a homogeneity test for the times between earthquakes, illustrates its potential applications.
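Editor's note: a minimal Python sketch of the exponential approximation described in this abstract, not the authors' code. The catalogue is hypothetical, and the asymptotic slope of the cumulative intensity is estimated with a simple linear fit over its tail.

```python
import numpy as np

# Sketch: approximate the waiting time of the next large event by an
# exponential law whose rate is the asymptotic slope of the cumulative
# intensity Lambda(t) of a nonhomogeneous Poisson process.

rng = np.random.default_rng(42)

# Hypothetical catalogue: occurrence times (years) of large events.
event_times = np.sort(rng.uniform(0.0, 500.0, size=40))

# Empirical cumulative intensity: number of events up to each time.
t_grid = np.linspace(0.0, 500.0, 1000)
Lambda = np.searchsorted(event_times, t_grid)

# Asymptotic slope: fit a straight line to the tail of Lambda(t).
tail = t_grid > 250.0
rate, _ = np.polyfit(t_grid[tail], Lambda[tail], 1)

# Exponential approximation of the waiting time T of the next event:
# P(T > t) ~ exp(-rate * t) when the last event occurred long ago.
t = 30.0  # years ahead
print(f"estimated rate: {rate:.4f} events/year")
print(f"P(no large event within {t:.0f} years) ~ {np.exp(-rate * t):.3f}")
```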
892.
Changing climate and precipitation patterns make the estimation of precipitation, which exhibits two-dimensional and sometimes chaotic behavior, increasingly challenging. In recent decades numerous data-driven methods have been developed and applied to estimate precipitation; however, these methods rely on one-dimensional approaches, lack generality, require neighboring stations, and have low sensitivity. This paper implements the first generally applicable, highly sensitive, two-dimensional data-driven model of precipitation. The model, named frequency-based imputation (FBI), relies on non-continuous monthly precipitation time series. It requires no determination of input parameters and no data preprocessing, and it provides multiple estimates (from the most to the least probable) of each missing data unit using the series itself. A total of 34,330 monthly total precipitation observations from 70 stations in 21 basins in Turkey were used to assess the method by removing and re-estimating observation series in annual increments. Comparisons with expectation-maximization and multiple-linear-regression models show that the FBI method is superior in estimating monthly precipitation. The paper also provides a link to the software code for the FBI method.
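Editor's note: a sketch of a frequency-based gap filler in the spirit of the FBI idea above, not the published implementation. For a missing month, candidate estimates are the histogram-bin centres of that calendar month's observed values, ranked from the most to the least frequent, using only the series itself.

```python
import numpy as np

def fbi_candidates(monthly_series, month_index, n_bins=10):
    """monthly_series: 1-D array with NaN gaps, one value per month,
    starting in January; month_index: position of the missing value."""
    month = month_index % 12
    same_month = monthly_series[month::12]
    observed = same_month[~np.isnan(same_month)]
    counts, edges = np.histogram(observed, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    order = np.argsort(counts)[::-1]          # most probable first
    return centres[order], counts[order]

rng = np.random.default_rng(0)
series = rng.gamma(2.0, 30.0, size=240)       # 20 years, hypothetical data
series[37] = np.nan                           # a missing February
estimates, freq = fbi_candidates(series, 37)
print("candidates (most to least probable):", np.round(estimates, 1))
```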
893.
The highest seismic activity in Vietnam is observed in the northwest of the country, hence the practical significance of a more accurate assessment of the earthquake hazard for this area. Worldwide experience of seismicity, in particular the recent Tohoku mega-earthquake (March 11, 2011, Mw = 9.0, Japan), shows that instrumental and historical data alone are insufficient to reliably estimate earthquake hazard. This is all the more relevant for Vietnam, where the period of instrumental observation is short and historical evidence is almost entirely lacking. We therefore attempted to construct earthquake hazard maps from known seismicity data, using the available geological and geophysical data and the method of G.I. Reisner and his associates for classifying areas by seismic potential. Since the question of which geological and geophysical parameters to use, and with what weights, remains unresolved, we developed a program package to estimate Mmax under different options for the use of geological and geophysical data. In this paper we discuss the first results and the promise held by this program package.
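Editor's note: an illustrative sketch of the idea of scoring map cells with different weighting options for geological and geophysical parameters. The parameter names, weights, and the score-to-Mmax mapping are all hypothetical; this is not Reisner's method or the authors' program package.

```python
import numpy as np

# Rows: map cells; columns: parameters normalised to [0, 1]
# (e.g. crustal thickness, heat flow, gravity anomaly -- assumed).
cells = np.array([[0.8, 0.3, 0.6],
                  [0.4, 0.7, 0.2],
                  [0.9, 0.8, 0.7]])

# Different options for combining the data, as the abstract describes.
weight_options = {
    "equal":       np.array([1/3, 1/3, 1/3]),
    "crust-heavy": np.array([0.6, 0.2, 0.2]),
}

def mmax_class(score):
    # Hypothetical mapping from composite score to an Mmax class.
    return 5.0 + 2.5 * score

for name, w in weight_options.items():
    scores = cells @ w                    # weighted composite per cell
    print(name, np.round(mmax_class(scores), 2))
```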
894.
Structure-from-Motion (SfM) photogrammetry is now widely used to study a range of earth surface processes and landforms, and is fast becoming a core tool in fluvial geomorphology. SfM photogrammetry allows the extraction of topographic information and orthophotos from aerial imagery. One field in which it is not yet widely used, however, is river restoration. The characterisation of physical habitat conditions pre- and post-restoration is critical for assessing project success, and SfM can be used easily and effectively for this purpose. In this paper we outline a workflow model for applying SfM photogrammetry to collect topographic data, develop surface models, and assess geomorphic change resulting from river restoration actions. We illustrate the application of the model to a river restoration project in northwest England, showing how SfM techniques have been used to assess whether the project is achieving its geomorphic objectives. We detail each stage of the workflow, from preliminary decisions on establishing a ground control network, through fish-eye lens camera testing and calibration, to final image analysis for creating facies maps, extracting point clouds, and developing digital elevation models (DEMs) and channel roughness maps. The workflow enabled us to confidently identify geomorphic changes in the river channel over time, as well as to assess spatial variation in erosion and aggradation. Critical to the assessment of change were the high number of ground control points and the application of a minimum level of detection threshold used to assess uncertainties in the topographic models; we suggest that these two elements are especially important for river restoration applications.
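Editor's note: a minimal sketch of the geomorphic-change step mentioned above, using a DEM of difference thresholded by a minimum level of detection (minLoD). This is an assumed, generic implementation, not the authors' workflow code; the error values are hypothetical survey uncertainties.

```python
import numpy as np

def dod_with_lod(dem_pre, dem_post, sigma_pre, sigma_post, t=1.96):
    """Return the DEM of difference with cells below the minLoD
    (propagated error at confidence level t) masked out."""
    dod = dem_post - dem_pre
    lod = t * np.sqrt(sigma_pre**2 + sigma_post**2)
    return np.where(np.abs(dod) >= lod, dod, np.nan), lod

rng = np.random.default_rng(1)
dem_pre = rng.normal(10.0, 0.5, size=(50, 50))
dem_post = dem_pre + rng.normal(0.0, 0.05, size=(50, 50))
dem_post[20:30, 20:30] += 0.4               # a patch of aggradation

change, lod = dod_with_lod(dem_pre, dem_post, 0.05, 0.05)
print(f"minLoD = {lod:.3f} m")
print(f"net elevation change = {np.nansum(change):.2f} m (per unit cell area)")
```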
895.
Reservoirs are among the most important structures for water resources management and flood control. Much attention has been paid to the effects of a reservoir on the downstream area and to the differences between inflows and dam-site floods caused by changes in upstream flow generation and concentration conditions after impoundment. These differences create an inconsistency between inflow quantiles and the reservoir design criteria derived from the dam-site flood series, which constitutes a potential risk and must be quantitatively evaluated. In this study, flood frequency analysis (FFA) and flood control risk analysis (FCRA) are applied to a long reservoir inflow series derived from a multiple-input, single-output model and a copula-based inflow estimation model. The results of the FFA and FCRA are compared, and the implications for reservoir flood management are discussed. The Three Gorges Reservoir (TGR) in China is selected as a case study. The results show that the differences between TGR inflows and dam-site floods are significant and change its flood control risk rates. The mean annual maximum inflow peak discharge and 3-day flood volume are 5.58% and 3.85% higher than the dam-site values, while the annual maximum 7- and 15-day flood volumes are 1.82% and 1.72% lower. The flood control risk rates of medium and small flood events increase, while those of extreme flood events decline. The TGR can satisfy its flood control task under the current hydrologic regime, and the results offer a reference for better management of the TGR.
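Editor's note: a sketch of the basic FFA/FCRA calculation, not the authors' model chain (which also involves a multiple-input, single-output model and a copula-based inflow estimator). A distribution is fitted to synthetic annual-maximum peaks, a design quantile is derived, and the flood control risk rate over an operating horizon is computed as R = 1 - (1 - 1/T)^n.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical annual-maximum inflow peaks (m^3/s), 60 years.
annual_max = stats.genextreme.rvs(c=-0.1, loc=50000, scale=8000,
                                  size=60, random_state=rng)

params = stats.genextreme.fit(annual_max)      # fit GEV to the maxima
T = 100                                        # return period, years
q_design = stats.genextreme.ppf(1 - 1/T, *params)

n = 50                                         # operating horizon, years
risk = 1 - (1 - 1/T)**n                        # flood control risk rate
print(f"{T}-year design peak ~ {q_design:.0f} m^3/s")
print(f"risk of at least one exceedance within {n} years: {risk:.2%}")
```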
896.
897.
Daily rainfall is a complex signal exhibiting an alternation of dry and wet states, seasonal fluctuations, and irregular behavior at multiple scales that cannot be preserved by stationary stochastic simulation models. In this paper we investigate strategies for preserving these features by comparing two recent algorithms for stochastic rainfall simulation. The first is the modified Markov model, which belongs to the family of Markov-chain techniques and introduces non-stationarity into the chain parameters to preserve the long-term behavior of rainfall. The second is direct sampling, based on multiple-point statistics, which aims to simulate a complex statistical structure by reproducing the data patterns found in a training data set. The two techniques are compared by simulating first a synthetic daily rainfall time series exhibiting a highly irregular alternation of two regimes, and then a real rainfall data set. This comparison allows us to analyze the effectiveness of the different elements characterizing the two techniques, such as a variable time dependence, adaptive kernel smoothing, and the use of low-frequency rainfall covariates. The results suggest, under different data-availability scenarios, which of these elements are most appropriate for representing the rainfall amount probability distribution at different scales, the annual seasonality, the dry-wet temporal pattern, and the persistence of rainfall events.
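Editor's note: a deliberately simplified sketch of the direct-sampling idea, not the published algorithm. To simulate the next daily value, a training series is scanned for windows resembling the most recent simulated values, and the value that followed the first sufficiently close match is sampled; the window length and distance threshold are arbitrary choices here.

```python
import numpy as np

def direct_sample_next(training, recent, threshold, rng):
    k = len(recent)
    starts = rng.permutation(len(training) - k)      # random scan order
    for s in starts:
        window = training[s:s + k]
        if np.mean(np.abs(window - recent)) <= threshold:
            return training[s + k]                   # value after the match
    return training[rng.integers(k, len(training))]  # fallback: random draw

rng = np.random.default_rng(3)
# Hypothetical training series: 10 years of skewed daily rainfall (mm).
training = rng.gamma(0.4, 8.0, size=3650)

sim = list(training[:5])                             # seed the simulation
for _ in range(30):
    sim.append(direct_sample_next(training, np.array(sim[-5:]), 1.0, rng))
print(np.round(sim[5:], 1))
```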
898.
Performing a comprehensive risk analysis is essential to ensuring a reliable and sustainable water supply. Although the general framework of risk analysis is well established, specific adaptation is needed for systems such as water distribution networks (WDN). Understanding the vulnerabilities of a WDN to deliberate contamination, and consumers' sensitivity to the use of contaminated water, is vital for informing decision-makers. This paper presents an innovative step-by-step methodology for developing comprehensive indicators to perform sensitivity, vulnerability, and criticality analyses in the absence of an early warning system (EWS). Assessing and aggregating these indicators with specific fuzzy operators allows the identification of the most critical points in a WDN: intentional intrusion of contaminants at these points can potentially harm both consumers and the water infrastructure. The methodology is demonstrated through a case study of a French WDN not equipped with sensors.
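Editor's note: an illustrative sketch of aggregating per-node indicators into a criticality score with fuzzy operators. The indicators and the geometric min/max compromise operator are assumptions for illustration, not the specific operators defined in the paper.

```python
import numpy as np

def fuzzy_and(memberships):            # conjunctive (pessimistic): min
    return np.min(memberships, axis=0)

def fuzzy_or(memberships):             # disjunctive (optimistic): max
    return np.max(memberships, axis=0)

def compromise(memberships, gamma=0.5):
    """Geometric blend of AND and OR; gamma in [0, 1] shifts the
    aggregation attitude from conjunctive to disjunctive."""
    return (fuzzy_and(memberships)**(1 - gamma)
            * fuzzy_or(memberships)**gamma)

# Rows: indicators (sensitivity, vulnerability); columns: network nodes.
m = np.array([[0.9, 0.2, 0.6],
              [0.7, 0.4, 0.8]])
crit = compromise(m, gamma=0.4)
print("criticality per node:", np.round(crit, 2))
print("most critical node index:", int(np.argmax(crit)))
```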
899.
Successful modeling of stochastic hydro-environmental processes relies heavily on the quantity and quality of accessible data, and noisy data can degrade model performance. Moreover, in the training phase of any artificial-intelligence-based model, each training data set is usually a limited sample of the possible patterns of the process and therefore may not represent the behavior of the whole population. Accordingly, in this article a wavelet-based denoising method is first used to smooth the hydrological time series; small normally distributed noises with zero mean and various standard deviations are then generated and added to the smoothed series to form different denoised-jittered training data sets. These are used for Artificial Neural Network (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling of the daily and multi-step-ahead rainfall-runoff process at the Milledgeville station on the Oconee River (USA) and the Pole Saheb station on the Jighatu River (Iran). The proposed hybrid data pre-processing approach is used here for the first time in time-series modeling, and in particular in the modeling of hydrological processes; moreover, the impacts of denoising (smoothing) and noise injection (jittering) have not previously been investigated simultaneously, in hydrology or in any other engineering field. To evaluate modeling performance, the outcomes were compared with the results of multiple linear regression and Auto-Regressive Integrated Moving Average models. The ANN and ANFIS models trained with denoised-jittered data improved single-step-ahead rainfall-runoff modeling in the verification phase by up to 14% and 12% at the Milledgeville station and up to 22% and 16% at the Pole Saheb station, and multi-step-ahead modeling with the proposed pre-processing approach also improved for both watersheds.
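Editor's note: a sketch of the denoise-then-jitter preprocessing described above, as an assumed implementation rather than the authors' code. The series is smoothed by soft-thresholding wavelet detail coefficients (universal threshold), then several jittered copies are created by adding small zero-mean Gaussian noise; the wavelet, level, and noise standard deviations are hypothetical choices.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Universal threshold estimated from the finest detail level.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(11)
# Hypothetical daily runoff signal with additive noise.
runoff = np.abs(np.sin(np.linspace(0, 20, 512))) * 50 + rng.normal(0, 5, 512)

smooth = wavelet_denoise(runoff)
jitter_sds = [0.5, 1.0, 2.0]                     # assumed noise levels
training_sets = [smooth + rng.normal(0.0, sd, smooth.shape)
                 for sd in jitter_sds]
print("denoised-jittered training sets:", len(training_sets),
      "series of length", smooth.size)
```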
900.
A minimum 1-D seismic velocity model for routine seismic event location was determined for the western Barents Sea using a modified version of the VELEST code. The resulting model, BARENTS16, and the corresponding station corrections were derived from data recorded at stations at regional distances, the vast majority located on the periphery of the recorded seismic activity owing to the unfavorable land-sea distribution. The recorded seismicity is represented by a joint bulletin produced by merging several international and regional bulletins for the region, together with additional parametric data from temporary deployments. We discuss the challenges posed by this extreme network-seismicity geometry for the resolution of the velocity estimates and the stability of the results. Although the conditions do not permit the estimation of meaningful station corrections at the farthermost stations, and even well-resolved corrections contribute little, we show that the process can still converge to a stable average velocity model for the crust and upper mantle, in good agreement with a priori information about the regional structure and geology, which adequately reduces errors in event location estimates.
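Editor's note: a toy sketch of the two quantities estimated in a minimum 1-D model inversion: an average velocity and per-station corrections. Here the problem is reduced to computing station corrections as mean travel-time residuals against a fixed half-space model; VELEST itself jointly relocates events and updates a layered velocity model, which this sketch does not attempt.

```python
import numpy as np

def direct_time(dist_km, depth_km, v_kms):
    # Straight-ray travel time in a uniform half-space (assumption).
    return np.hypot(dist_km, depth_km) / v_kms

rng = np.random.default_rng(5)
v_true, v_model = 6.1, 6.0                    # km/s, hypothetical values
dists = rng.uniform(50, 400, size=(3, 100))   # 3 stations, 100 events
depths = rng.uniform(5, 25, size=100)

observed = direct_time(dists, depths, v_true) + rng.normal(0, 0.1, dists.shape)
predicted = direct_time(dists, depths, v_model)

# Station correction: mean residual absorbing near-station structure.
station_corrections = np.mean(observed - predicted, axis=1)
print("station corrections (s):", np.round(station_corrections, 2))
```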