Full-text access type
Paid full text | 5322 papers |
Free | 542 papers |
Domestic free | 157 papers |
Subject classification
Surveying and mapping | 238 papers |
Atmospheric science | 600 papers |
Geophysics | 1946 papers |
Geology | 2130 papers |
Oceanography | 298 papers |
Astronomy | 368 papers |
General | 187 papers |
Physical geography | 254 papers |
Publication year
2022 | 6 papers |
2021 | 20 papers |
2020 | 5 papers |
2019 | 9 papers |
2018 | 433 papers |
2017 | 376 papers |
2016 | 250 papers |
2015 | 149 papers |
2014 | 115 papers |
2013 | 118 papers |
2012 | 648 papers |
2011 | 424 papers |
2010 | 120 papers |
2009 | 135 papers |
2008 | 126 papers |
2007 | 116 papers |
2006 | 132 papers |
2005 | 836 papers |
2004 | 875 papers |
2003 | 654 papers |
2002 | 182 papers |
2001 | 70 papers |
2000 | 44 papers |
1999 | 16 papers |
1998 | 8 papers |
1997 | 22 papers |
1996 | 13 papers |
1991 | 11 papers |
1990 | 12 papers |
1989 | 6 papers |
1987 | 5 papers |
1983 | 3 papers |
1980 | 7 papers |
1979 | 3 papers |
1978 | 4 papers |
1976 | 4 papers |
1975 | 7 papers |
1974 | 4 papers |
1973 | 3 papers |
1969 | 2 papers |
1968 | 2 papers |
1965 | 3 papers |
1963 | 2 papers |
1961 | 2 papers |
1959 | 2 papers |
1955 | 2 papers |
1954 | 2 papers |
1951 | 2 papers |
1948 | 2 papers |
1934 | 2 papers |
6021 results found (search time: 0 ms)
101.
Chris E. Gregg, Bruce F. Houghton, Douglas Paton, Donald A. Swanson, David M. Johnston 《Bulletin of Volcanology》2004,66(6):531-540
Lava flows from Mauna Loa and Hualālai volcanoes are a major volcanic hazard that could impact the western portion of the Island of Hawaii (e.g., Kona). The most recent eruptions of these two volcanoes to affect Kona occurred in A.D. 1950 and ca. 1800, respectively. In contrast, in eastern Hawaii, eruptions of neighboring Kīlauea volcano have occurred frequently since 1955 and have therefore been the focus of hazard mitigation; official preparedness and response measures are modeled on typical eruptions of Kīlauea. The combination of short-lived precursory activity (e.g., volcanic tremor) at Mauna Loa, the potential for fast-moving lava flows, and the proximity of Kona communities to potential vents represents a significant emergency management concern in Kona. Less is known about past eruptions of Hualālai, but similar concerns exist. Future lava flows present an increased threat to personal safety because of the short time that may be available for responding. Mitigation must address not only the specific characteristics of volcanic hazards in Kona, but also the manner in which the hazards relate to the communities likely to be affected. This paper describes the first steps in developing effective mitigation plans: measuring the current state of people's knowledge of eruption parameters and the implications for their safety. We present results of a questionnaire survey administered to 462 high school students and adults in Kona. The rationale for this study was the long time elapsed since the last Kona eruption, together with the high population growth and expansion of infrastructure over this interval. Anticipated future growth in social and economic infrastructure in this area provides additional justification for this work. The residents of Kona have received little or no specific information about how to react to future volcanic eruptions or warnings, and short-term preparedness levels are low.
Respondents appear uncertain about how to respond to threatening lava flows and overestimate the minimum time available to react, suggesting that personal risk levels are unnecessarily high. A successful volcanic warning plan in Kona must be tailored to meet the unique situation there.
102.
103.
John G. Manchuk, Ryan M. Barnett, Clayton V. Deutsch 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(10):2585-2605
A challenge when working with multivariate data in a geostatistical context is that the data are rarely Gaussian. Multivariate distributions may include nonlinear features, clustering, long tails, functional boundaries, spikes, and heteroskedasticity. Multivariate transformations account for such features so that they are reproduced in geostatistical models. Projection pursuit, as developed for high-dimensional data exploration, can also be used to transform a multivariate distribution into a multivariate Gaussian distribution with an identity covariance matrix. Its application within a geostatistical modeling context is called the projection pursuit multivariate transform (PPMT). An approach to incorporating exhaustive secondary variables into the PPMT is introduced; with it, the PPMT can combine any number of secondary variables with any number of primary variables. A necessary alteration to make the approach numerically practical was the implementation of a continuous probability estimator, based on Bernstein polynomials, for the transformation that takes place in the projections. The stopping criteria were updated to incorporate a bootstrap t-test that compares data sampled from a multivariate Gaussian distribution with the data undergoing transformation.
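The core PPMT loop (choose a direction, Gaussianize the projected values, repeat) can be sketched in a few lines. The sketch below is illustrative only: it uses random directions and a rank-based normal-score transform in place of the paper's optimized projections and Bernstein-polynomial probability estimator, and all function names are hypothetical.

```python
import random
from statistics import NormalDist

def normal_scores(x):
    # rank-based transform of a 1-D sample to standard normal scores
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order):
        z[i] = nd.inv_cdf((rank + 0.5) / n)  # plotting-position quantile
    return z

def ppmt_sketch(data, iterations=10, seed=0):
    # data: list of [x1, x2] rows; repeatedly Gaussianize along a random
    # unit direction (a crude stand-in for projection pursuit)
    rng = random.Random(seed)
    pts = [row[:] for row in data]
    for _ in range(iterations):
        a, b = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        norm = (a * a + b * b) ** 0.5
        a, b = a / norm, b / norm
        proj = [a * p[0] + b * p[1] for p in pts]
        z = normal_scores(proj)
        # shift each point along the direction so its projection becomes z
        for p, pr, zz in zip(pts, proj, z):
            shift = zz - pr
            p[0] += shift * a
            p[1] += shift * b
    return pts
```

After enough iterations, projections of the transformed points along arbitrary directions look increasingly Gaussian, which is the property the PPMT exploits.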
104.
Gyan Chhipi-Shrestha, Julie Mori, Kasun Hewage, Rehan Sadiq 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(2):417-430
Several risk factors associated with an increased likelihood of healthcare-associated Clostridium difficile infection (CDI) have been identified in the literature, mainly related to age, previous CDI, antimicrobial exposure, and prior hospitalization. No model in the published literature can predict CDI incidence from healthcare administration data. Such administrative data can be imprecise, which complicates the building of classical statistical models; fuzzy set theory can deal with the imprecision inherent in such data. This research aimed to develop a model based on deterministic and fuzzy mathematical techniques for predicting hospital-associated CDI from explanatory variables controllable by hospitals and health authority administration. Retrospective data on CDI incidence and other administrative data obtained from 22 hospitals within a regional health authority in British Columbia were used to develop a decision tree (a deterministic technique) and a fuzzy synthetic evaluation model (a fuzzy technique). The decision tree model had higher prediction accuracy than the fuzzy-based model; however, 72% of the results predicted by both models were correct, so the two sets of results were combined to increase the precision and strength of evidence of the prediction. The models were further used to develop an Excel-based tool called C. difficile Infection Incidence Prediction in Hospitals (CDIIPH), which health authorities and hospitals can use to predict the magnitude of CDI incidence in the following quarter.
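The fuzzy half of the approach can be illustrated with a minimal fuzzy synthetic evaluation step: each factor's membership degrees across risk levels are aggregated by a weighted average. The membership values, weights, and function name below are hypothetical placeholders, not the paper's calibrated model.

```python
def fuzzy_synthetic_evaluation(memberships, weights):
    # memberships: one row per factor, one column per risk level,
    # each entry a membership degree in [0, 1]
    # weights: factor importance weights (assumed to sum to 1)
    levels = len(memberships[0])
    # weighted-average composition: score per risk level
    scores = [sum(w * row[j] for w, row in zip(weights, memberships))
              for j in range(levels)]
    total = sum(scores)
    # normalize so the result reads as a membership distribution
    return [s / total for s in scores]
```

With two factors, one pointing entirely at "low risk" and one at "high risk", and weights 0.7/0.3, the composite evaluation is simply [0.7, 0.3].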
105.
Pierre Masselot, Fateh Chebana, Taha B.M.J. Ouarda 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(2):509-522
Regional frequency analysis is an important tool for properly estimating hydrological characteristics at ungauged or partially gauged sites in order to prevent hydrological disasters. The delineation of homogeneous groups of sites is an important first step in transferring information and obtaining accurate quantile estimates at the target site. The Hosking–Wallis (HW) homogeneity test is usually used to test the homogeneity of the selected sites. Despite its usefulness and good power, it has drawbacks, including the subjective choice of a parametric distribution for the data and a poorly justified rejection threshold. The present paper addresses these drawbacks by integrating nonparametric procedures into the L-moment homogeneity test. To assess the rejection threshold, three resampling methods (permutation, bootstrap, and Pólya resampling) are considered. Results indicate that the permutation and bootstrap methods perform better than the parametric HW test in terms of power as well as simplicity of procedure and computation time. A real-world case study shows that the nonparametric tests agree with the HW test concerning the homogeneity of the volume and the bivariate case, while they disagree for the peak case; in that case, however, the assumptions of the HW test are not well respected.
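The resampling idea can be sketched with a generic permutation homogeneity test: pool the sites' observations, reshuffle them into groups of the original sizes, and see how often the reshuffled heterogeneity statistic reaches the observed one. For simplicity the statistic below is the between-site variance of sample means rather than the L-moment-based HW statistic; names and defaults are illustrative assumptions.

```python
import random
import statistics

def permutation_homogeneity_test(sites, n_perm=999, seed=1):
    # sites: list of per-site samples
    # H0: all sites draw from the same distribution
    obs = statistics.pvariance([statistics.mean(s) for s in sites])
    pooled = [x for s in sites for x in s]
    sizes = [len(s) for s in sites]
    rng = random.Random(seed)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        idx, means = 0, []
        for n in sizes:
            means.append(statistics.mean(pooled[idx:idx + n]))
            idx += n
        if statistics.pvariance(means) >= obs:
            count += 1
    # permutation p-value with the usual +1 correction
    return (count + 1) / (n_perm + 1)
```

Two sites with wildly different levels yield a tiny p-value (heterogeneous); sites drawn from a common pool yield a large one.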
106.
Seong-Hee Kim, Mustafa M. Aral, Yongsoon Eun, Jisu J. Park, Chuljin Park 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(3):743-756
This paper studies the impact of sensor measurement error on the design of a water quality monitoring network for a river system and shows that robust sensor locations can be obtained when an optimization algorithm is combined with a statistical process control (SPC) method. Specifically, we develop a probabilistic model of sensor measurement error and embed it into a simulation model of a river system. An optimization algorithm is used to find the sensor locations that minimize the expected time until spill detection, subject to a constraint on the probability of detecting a spill. The experimental results show that the optimal sensor locations are highly sensitive to the variability of the measurement error, and that false alarm rates are often unacceptably high. An SPC method is useful for finding thresholds that guarantee a false alarm rate no greater than a pre-specified target level, and an optimization algorithm combined with these thresholds finds a robust sensor network.
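The threshold-calibration step can be sketched as an empirical-quantile rule: given alarm-free baseline readings from a sensor, pick the smallest threshold whose exceedance frequency in the baseline does not exceed the target false alarm rate. This is a hypothetical simplification of an SPC control limit, not the paper's actual procedure.

```python
import math

def threshold_for_false_alarm(baseline, target_rate):
    # baseline: sensor readings collected with no spill present
    # returns the smallest threshold whose baseline exceedance
    # frequency is at most target_rate
    s = sorted(baseline)
    # index of the empirical (1 - target_rate) quantile
    k = max(0, math.ceil(len(s) * (1.0 - target_rate)) - 1)
    return s[k]
```

For 100 distinct baseline readings and a 5% target, exactly 5 readings sit above the returned threshold, so the empirical false alarm rate is 5%.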
107.
Dongkyun Kim, Huidae Cho, Christian Onof, Minha Choi 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(4):1023-1043
We present a web application named Let-It-Rain that generates a 1-h temporal resolution synthetic rainfall time series using the modified Bartlett–Lewis rectangular pulse (MBLRP) model, a type of Poisson stochastic rainfall generator. Let-It-Rain, accessible at http://www.LetItRain.info, adopts a web-based framework combining ArcGIS Server on the server side for parameter value dissemination and JavaScript on the client side to implement the MBLRP model. This enables any desktop or mobile user with internet access and a web browser to obtain, with only a few mouse clicks, a synthetic rainfall time series at any location for which the parameter regionalization work has been completed (currently the contiguous United States and the Republic of Korea). Let-It-Rain shows satisfactory performance in reproducing the observed rainfall mean, variance, autocorrelation, and probability of zero rainfall at hourly through daily accumulation levels. It also performs reasonably well in reproducing watershed runoff depth and peak flow. We expect that Let-It-Rain can stimulate the uncertainty analysis of hydrologic variables across the world.
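A minimal Poisson rectangular-pulse generator conveys the flavor of this model family: storms arrive as a Poisson process, and each deposits a rectangular pulse of random duration and intensity onto an hourly series. The full MBLRP model additionally clusters pulses within storm cells; the parameters and names below are illustrative assumptions, not Let-It-Rain's regionalized values.

```python
import random

def poisson_pulse_rainfall(hours, storm_rate, mean_duration,
                           mean_intensity, seed=42):
    # simplified Poisson rectangular-pulse model (NOT the full MBLRP):
    # storm arrivals ~ Poisson(storm_rate per hour); each storm adds a
    # rectangular pulse with exponential duration and intensity
    rng = random.Random(seed)
    series = [0.0] * hours
    t = rng.expovariate(storm_rate)  # first arrival time
    while t < hours:
        dur = rng.expovariate(1.0 / mean_duration)
        inten = rng.expovariate(1.0 / mean_intensity)
        for h in range(int(t), min(hours, int(t + dur) + 1)):
            series[h] += inten  # overlapping pulses accumulate
        t += rng.expovariate(storm_rate)  # next arrival
    return series
```

Calibration would then tune the rate, duration, and intensity parameters so that the series reproduces the observed mean, variance, autocorrelation, and dry probability, as the paper does for the MBLRP.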
108.
Mario Gómez, M. Concepción Ausín, M. Carmen Domínguez 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(5):1107-1121
Modelling glacier discharge is an important issue in hydrology and climate research. Glaciers represent a fundamental water resource when melting ice and snow contribute to runoff; they are also studied as natural global warming sensors. The GLACKMA association has implemented one of its Pilot Experimental Catchment areas at King George Island in Antarctica, which records the liquid discharge from the Collins glacier. In this paper, we propose time-varying copula models for analyzing the relationship between air temperature and glacier discharge, which is clearly nonconstant and nonlinear through time. A seasonal copula model is defined in which both the marginal and copula parameters vary periodically, following a seasonal dynamic. Full Bayesian inference is performed so that the marginal and copula parameters are estimated in a single step, in contrast with the usual two-step approach. Bayesian prediction and model selection are also carried out for the proposed model, so that Bayesian credible intervals can be obtained for the conditional glacier discharge given a temperature value at any time point. The proposed methodology is illustrated using the GLACKMA data, which in addition contain a hydrological year of missing discharge values that could not be measured accurately because of problems with the sounding.
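The idea of a copula parameter that varies seasonally can be sketched with a Gaussian copula whose correlation follows an annual cosine cycle. The paper's model is richer (Bayesian, with seasonally varying marginals as well), so everything below, including the parameter values, is an illustrative assumption.

```python
import math
import random
from statistics import NormalDist

def seasonal_rho(day, rho0=0.3, amp=0.4):
    # copula correlation follows an annual cycle;
    # |rho0| + |amp| must stay below 1 for a valid correlation
    return rho0 + amp * math.cos(2.0 * math.pi * day / 365.0)

def seasonal_gaussian_copula_sample(day, rho0=0.3, amp=0.4, seed=0):
    # draw one (u, v) pair from a Gaussian copula whose correlation
    # depends on the day of year (e.g., temperature vs. discharge)
    rho = seasonal_rho(day, rho0, amp)
    rng = random.Random(seed)
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    nd = NormalDist()
    return nd.cdf(z1), nd.cdf(z2)  # uniforms with seasonal dependence
```

In midsummer the correlation peaks (melt tightly couples temperature to discharge) and in midwinter it relaxes, which is the qualitative behavior a seasonal copula is meant to capture.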
109.
Qian Zhang, Xiujuan Liang, Zhang Fang, Changlai Xiao 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(7):1697-1707
Precipitation is an important part of the hydrologic cycle, and its complexity is closely related to surface runoff and changing groundwater dynamics, which in turn influence the accuracy of precipitation forecasts. In this study, we used the Lempel–Ziv algorithm (LZA) and a multi-scaling approach to assess precipitation complexity for 1958–2011, analyzing time series data from 28 gauging stations located throughout Jilin province, China. The spatial distribution of normalized precipitation complexity was measured both by the LZA, a symbolic dynamics algorithm, and by the multi-scaling approach, which is described by fractals. In addition, the advantages and limitations of the two methods were investigated. The results indicate that both methods are applicable and give consistent estimates of precipitation complexity, and that the degree of relief is the primary factor controlling precipitation complexity in the mountainous area, whereas in the plain terrain the prominent influencing factor is climate.
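The Lempel–Ziv complexity of a symbolized series (e.g., a precipitation record coded as wet/dry symbols) can be sketched with a simple LZ76-style parser that counts distinct phrases: a phrase grows while it already occurs in the preceding prefix, and each new phrase increments the count. This is a simplified variant; the normalization step used in the study is omitted.

```python
def lempel_ziv_complexity(s):
    # count phrases in a simple Lempel-Ziv (LZ76-style) parsing of
    # the symbol string s: extend the current phrase while it already
    # occurs somewhere in the prefix seen so far
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[:i].find(s[i:i + l]) != -1:
            l += 1
        c += 1  # a new phrase ends here
        i += l
    return c
```

A constant or periodic series parses into few phrases while an irregular one keeps producing new phrases, so the count serves as a complexity measure.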
110.
Raúl Fierro, Víctor Leiva 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(9):2327-2336
We propose a stochastic methodology for assessing the risk of a large earthquake when a long time has elapsed since the last large seismic event. We state an approximate probability distribution for the occurrence time of the next large earthquake, given that the last large seismic event occurred long ago. We prove that, under reasonable conditions, this distribution is exponential, with a rate depending on the asymptotic slope of the cumulative intensity function of a nonhomogeneous Poisson process. As it is not possible to obtain an empirical cumulative distribution function of the waiting time for the next large earthquake, an estimator of its cumulative distribution function based on existing data is derived. We conduct a simulation study to detect scenarios in which the proposed methodology performs well. Finally, a real-world data analysis illustrates its potential applications, including a homogeneity test for the times between earthquakes.
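The limiting result, an exponential waiting-time distribution whose rate is the asymptotic slope of the cumulative intensity, can be illustrated with a crude plug-in estimator: approximate the slope by the event count divided by the catalog span, then evaluate the exponential CDF. The function below is a hypothetical sketch, not the authors' estimator.

```python
import math

def large_quake_probability(event_times, horizon):
    # event_times: occurrence times of past large earthquakes
    # horizon: look-ahead window from now (same time units)
    # crude slope estimate of the cumulative intensity function
    rate = len(event_times) / (max(event_times) - min(event_times))
    # exponential approximation for the waiting time of the next event
    return 1.0 - math.exp(-rate * horizon)
```

For 11 events spread over 100 years, the estimated rate is 0.11 per year, giving a probability of roughly 1 - e^(-1.1) of a large event within the next decade under this approximation.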