61.
To address the problem that the Poisson surface reconstruction algorithm still cannot effectively balance detail preservation against noise smoothing when meshing point cloud data, this paper proposes an improved Poisson algorithm based on Gaussian filtering. By introducing Gaussian filtering into the estimation of the vector field of the point cloud isosurface, the method achieves a more accurate estimate of the point cloud topology and effective smoothing of point cloud noise; in addition, adjusting the standard deviation parameter of the Gaussian filter gives fine control over the trade-off between detail preservation and noise smoothing in the meshed model. Taking the Zhang Gumei statue at Fuzhou University as the test object, with point cloud data obtained by image-based 3D reconstruction as the data source, the improved Poisson algorithm was used to mesh the point cloud. The results show that the improved Poisson algorithm increases the accuracy and completeness of the mesh model and visually approximates the details of the real model more closely, verifying the effectiveness of the improved algorithm.
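The core idea described here — smoothing the normal (vector) field that Poisson reconstruction integrates with a Gaussian kernel whose standard deviation trades detail against noise suppression — can be sketched roughly as follows. This is an illustrative stand-in, not the authors' implementation: the file name, `sigma`, neighbourhood radius, and octree `depth` are hypothetical, and Open3D's standard Poisson reconstruction is used for the meshing step.

```python
import numpy as np
import open3d as o3d                      # pip install open3d
from scipy.spatial import cKDTree

# Load a point cloud (file name is hypothetical) and estimate raw normals.
pcd = o3d.io.read_point_cloud("statue.ply")
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))

pts = np.asarray(pcd.points)
normals = np.asarray(pcd.normals)
tree = cKDTree(pts)

# Gaussian smoothing of the normal field: sigma controls the balance between
# detail preservation (small sigma) and noise smoothing (large sigma).
sigma = 0.02
smoothed = np.empty_like(normals)
for i, p in enumerate(pts):
    idx = tree.query_ball_point(p, r=3.0 * sigma)
    w = np.exp(-np.sum((pts[idx] - p) ** 2, axis=1) / (2.0 * sigma ** 2))
    n = (w[:, None] * normals[idx]).sum(axis=0)
    smoothed[i] = n / (np.linalg.norm(n) + 1e-12)

pcd.normals = o3d.utility.Vector3dVector(smoothed)

# Standard Poisson surface reconstruction on the smoothed vector field.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
```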
62.
The goodness of fit of the negative binomial and the Poisson distributions to partial duration series of runoff events is tested. The data have been recorded by eight hydrometric stations located on ephemeral rivers in Israel. For each station, a number of threshold discharges are considered, thereby forming series of nested subsamples. Owing to size limitations, a Chi-square test is conducted on samples associated with low to moderate threshold discharges. Positive results, at a 5% significance level, are obtained in 30 out of the 53 tests of the Poisson distribution, and in 22 out of the 28 tests of the negative binomial distribution. The fit of the Poisson distribution to samples of conventionally recommended sizes (of 2 to 3 events per year) is found positive for five rivers and negative for the three other rivers. The fit of the negative binomial distribution to these samples is found positive for six rivers, inconclusive for one river, and short of data for the eighth river. Mixed results are obtained as the threshold level is raised. Therefore, no direct extrapolation is possible to samples associated with high thresholds. An indirect extrapolation is drawn through a comparison of the actual properties of the samples with those expected under a perfect fit of the distribution functions. Ranges of such properties are defined with respect to the properties of the tested samples and to the test results. The actual properties of nine of the eleven samples associated with high thresholds (i.e. mean number of events ≤ 0.1 year⁻¹) are found within these ranges. This provides a hint for a probable good fit of either distribution, and particularly the negative binomial, to the occurrence frequency of high events.
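A minimal sketch of the kind of test described here — a Chi-square goodness-of-fit test of the Poisson distribution against annual event counts — is shown below. The counts are hypothetical, and the upper tail is grouped so that expected class frequencies stay reasonable for a small sample.

```python
import numpy as np
from scipy import stats

# Hypothetical annual counts of runoff events exceeding a threshold discharge.
counts = np.array([0, 2, 1, 3, 0, 1, 2, 4, 1, 0, 2, 1, 3, 2, 1])
lam = counts.mean()                       # Poisson rate estimated from the sample

# Group observations into classes 0, 1, 2 and ">=3" so expected frequencies
# are not too small for the Chi-square approximation.
classes = [0, 1, 2]
observed = np.array([np.sum(counts == k) for k in classes] + [np.sum(counts >= 3)])
probs = np.array([stats.poisson.pmf(k, lam) for k in classes]
                 + [1.0 - stats.poisson.cdf(2, lam)])
expected = probs * counts.size

# One parameter (lambda) was estimated from the data, hence ddof=1.
chi2, p = stats.chisquare(observed, f_exp=expected, ddof=1)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p > 0.05 -> Poisson not rejected at the 5% level
```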
63.
A devastating flood occurred in southern Alberta on June 19, 2013, caused by greater-than-normal snowfall in the Rocky Mountains and excess precipitation during the early spring, which left soils saturated and unable to absorb any additional precipitation. This flood was Canada's most costly natural disaster, with five to six billion Canadian dollars in damages. The first objective of this study was to determine whether the flood caused an increase in private drinking water well contamination in the Calgary Health Zone by comparing contamination rates to previous years. The second objective was to determine which environmental factors were associated with contamination during this flood event. Test results for total coliforms (TC) and Escherichia coli (EC) from private water wells were used to determine contamination. A geographically weighted Poisson regression analysis suggested that TC contamination was not associated with this flood. EC contamination was positively associated with floodways, the flood fringe, and farms, and negatively associated with intermittent water bodies (sloughs). These results suggest that for the 2013 flood, individual well characteristics were more important than surrounding geographic features. Thus, it is recommended that homeowners who live in a high-risk area ensure their wells are properly maintained to reduce the risk of water well contamination.
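As a simplified, non-geographic stand-in for the geographically weighted Poisson regression used in the study, the sketch below fits an ordinary Poisson GLM of contamination counts on flood-related covariates with statsmodels. The variable names and data are hypothetical, and the local (geographic) weighting that distinguishes GWPR from a global model is deliberately omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical well data: counts of positive E. coli tests plus binary covariates.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "ec_positive": rng.poisson(1.0, 200),          # response: positive EC tests per well
    "in_floodway": rng.integers(0, 2, 200),        # well located in a floodway
    "in_flood_fringe": rng.integers(0, 2, 200),    # well located in the flood fringe
    "near_farm": rng.integers(0, 2, 200),          # farm within some buffer distance
    "near_slough": rng.integers(0, 2, 200),        # intermittent water body nearby
})

X = sm.add_constant(df[["in_floodway", "in_flood_fringe", "near_farm", "near_slough"]])
model = sm.GLM(df["ec_positive"], X, family=sm.families.Poisson()).fit()

# Exponentiated coefficients are rate ratios: >1 means higher expected contamination.
print(np.exp(model.params))
```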
64.
Since the late Late Paleozoic, Northeast China has been affected by the superimposed Paleo-Asian Ocean, Mongol-Okhotsk Ocean, and Pacific tectonic domains, leaving an extremely complex crust-mantle structure. This paper compiles 100,980 P-wave and 91,030 S-wave arrival times recorded by the national seismic network in Northeast China and uses seismic travel-time tomography to obtain the crustal P- and S-wave velocity structure of the region, and from these the Poisson's ratio structure, in order to investigate the complex crustal structure. The tomographic results show that the crustal seismic velocity structure of Northeast China is markedly laterally heterogeneous, with differing degrees of heterogeneity both between and within tectonic units. In the Songliao Basin, the shallow crust is dominated by low-velocity anomalies, especially in S-wave velocity, although resolution is low in some areas; the middle and lower crust contains extensive high-velocity anomalies, inferred to be related to dynamic processes such as lithospheric delamination and hot-material upwelling driven by Pacific subduction and slab rollback. The Great Xing'an Range gravity gradient zone in the north and the Changbaishan belt are dominated by low-velocity anomalies, indicating widespread magmatism, which provides heat or material sources for the formation of solid mineral resources. Parts of the crust beneath recently active volcanoes such as Changbaishan and Wudalianchi show pronounced low seismic velocities and high Poisson's ratio anomalies, suggesting that they may still be active.
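The Poisson's ratio structure mentioned here follows from the tomographic Vp and Vs models through the standard elastic relation ν = (Vp² − 2Vs²) / (2(Vp² − Vs²)). A minimal sketch, with hypothetical velocity values, is:

```python
import numpy as np

def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (standard elastic relation)."""
    r2 = (np.asarray(vp, dtype=float) / np.asarray(vs, dtype=float)) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# Hypothetical crustal values: Vp/Vs ~ 1.73 gives nu ~ 0.25, while an elevated
# ratio (e.g. beneath an active volcano with partial melt) pushes nu toward 0.30.
print(poissons_ratio(6.0, 6.0 / 1.73))   # ~0.25
print(poissons_ratio(6.0, 6.0 / 1.85))   # ~0.29
```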
65.
董瑞树  谷德贵 《地震地质》1993,15(3):239-246
A double-Poisson renewal-process model is used to study the two earthquake occurrence rates for events of each magnitude class in the northern part of North China, and each rate is tested; finally, a probability allocation model is used to derive the probability of earthquake occurrence in each potential source zone over the next 50 years.
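For a stationary Poisson occurrence model, the probability of at least one event in a 50-year window follows directly from the rate, P = 1 − exp(−λT). The sketch below shows only this single-rate building block, with hypothetical rates; the paper's double-Poisson renewal model combines two such rates.

```python
import numpy as np

def prob_at_least_one(rate_per_year, horizon_years=50.0):
    """P(N >= 1) over the horizon for a stationary Poisson process."""
    return 1.0 - np.exp(-rate_per_year * horizon_years)

# Hypothetical annual occurrence rates for two magnitude classes in one source zone.
for lam in (0.002, 0.02):
    print(f"rate = {lam}/yr -> P(at least one event in 50 yr) = {prob_at_least_one(lam):.3f}")
```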
66.
67.
68.
Spatial-temporal rainfall modelling for flood risk estimation
Some recent developments in the stochastic modelling of single-site and spatial rainfall are summarised. Alternative single-site models based on Poisson cluster processes are introduced, fitting methods are discussed, and performance is compared for representative UK hourly data. The representation of sub-hourly rainfall is discussed, and results from a temporal disaggregation scheme are presented. Extension of the Poisson process methods to spatial-temporal rainfall, using radar data, is reported. Current methods assume spatial and temporal stationarity; work in progress seeks to relax these restrictions. Unlike radar data, long sequences of daily raingauge data are commonly available, and the use of generalized linear models (GLMs), which can represent both temporal and spatial non-stationarity, to describe the spatial structure of daily rainfall from raingauge data is illustrated for a network in the North of England. For flood simulation, disaggregation of daily rainfall is required. A relatively simple methodology is described, in which a single-site Poisson process model provides hourly sequences conditioned on the observed or GLM-simulated daily data. As a first step, complete spatial dependence is assumed. Results from the River Lee catchment, near London, are promising. A relatively comprehensive set of methodologies is thus provided for hydrological applications.
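A minimal single-site Poisson cluster (Neyman-Scott rectangular pulse) simulator conveys the structure of these models: storm origins arrive as a Poisson process, each storm spawns a random number of rain cells with exponentially distributed displacements, durations, and intensities, and the hourly series is the sum of the active cells. The parameter values below are purely illustrative, not fitted to UK data.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_nsrp(hours, lam=0.02, mu_c=4.0, beta=0.1, eta=1.0, xi=1.5, dt=1.0):
    """Minimal Neyman-Scott rectangular-pulse rainfall simulator (illustrative only).

    lam  : storm-origin rate (1/h)        mu_c : mean number of cells per storm
    beta : cell-displacement rate (1/h)   eta  : cell-duration rate (1/h)
    xi   : mean cell intensity (mm/h)     dt   : time step (h)
    """
    t = np.arange(0.0, hours, dt)
    rain = np.zeros_like(t)
    origins = rng.uniform(0.0, hours, rng.poisson(lam * hours))
    for origin in origins:
        for _ in range(rng.poisson(mu_c)):
            start = origin + rng.exponential(1.0 / beta)
            duration = rng.exponential(1.0 / eta)
            intensity = rng.exponential(xi)
            rain += intensity * ((t >= start) & (t < start + duration)) * dt
    return t, rain

t, rain = simulate_nsrp(24 * 30)                # one month of hourly rainfall
print(rain.sum(), (rain > 0).mean())            # total depth (mm), wet-hour fraction
```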
69.
Data collected along transects are becoming more common in environmental studies as indirect measurement devices that can be attached to mobile platforms, such as geophysical sensors, become more prevalent. Because exhaustive sampling is not always possible under constraints of time and cost, geostatistical interpolation techniques are used to estimate unknown values at unsampled locations from transect data. It is known that outlying observations can receive significantly greater ordinary kriging weights than centrally located observations when the data are contiguously aligned along a transect within a finite search window. Deutsch (1994) proposed a kriging algorithm, finite domain kriging, that uses a redundancy measure in place of the covariance function in the data-to-data kriging matrix to address the problem of overweighting the outlying observations. This paper compares the performance of two kriging techniques, ordinary kriging (OK) and finite domain kriging (FDK), in estimating unexploded ordnance (UXO) densities by comparing prediction errors at unsampled locations. The impact of sampling design on object count prediction is also investigated using data collected from transects and at random locations. The Poisson process is used to model the spatial distribution of UXO for three 5000 × 5000 m fields: one does not contain any ordnance target (homogeneous field), while the other two have an ordnance target in the center of the site (isotropic and anisotropic fields). In general, for a given sampling transect width, the differences between OK and FDK in terms of the mean error and the mean square error are not significant regardless of the sampled area and the choice of the field. When 20% or more of the site is sampled, the estimation of object counts is unbiased on average for all three fields regardless of the choice of the transect width and the choice of the kriging algorithm. However, for non-homogeneous fields (isotropic and anisotropic fields), the mean error fluctuates considerably when a small number of transects are sampled. The difference between transect sampling and random sampling in terms of prediction errors becomes almost negligible if more than 20% of the site is sampled. Overall, FDK is no better than OK in terms of prediction performance when the transect sampling procedure is used.
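A minimal simulation of the homogeneous case — objects scattered as a spatial Poisson process over a 5000 × 5000 m site and counted along evenly spaced transect strips — is sketched below. The intensity, strip width, and spacing are hypothetical, and the kriging comparison itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Homogeneous spatial Poisson process on a 5000 m x 5000 m site (hypothetical intensity).
size = 5000.0
intensity = 2e-5                                  # objects per m^2 (~500 objects expected)
n = rng.poisson(intensity * size * size)
xy = rng.uniform(0.0, size, size=(n, 2))

# Count objects falling inside vertical transect strips of a given width and spacing.
width, spacing = 10.0, 200.0
strip_edges = np.arange(0.0, size, spacing)
in_strip = ((xy[:, :1] >= strip_edges) & (xy[:, :1] < strip_edges + width)).any(axis=1)

sampled_area = len(strip_edges) * width * size
print(f"objects on site: {n}, seen on transects: {in_strip.sum()}")
print(f"naive density estimate: {in_strip.sum() / sampled_area:.2e} objects per m^2 "
      f"(true intensity: {intensity:.2e})")
```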
70.
In this paper, the temporal and spatial variation of seismicity in the area from Lancang to Tengchong before the 1988 Lancang-Gengma earthquakes (M=7.6, 7.2), from January 1980 to October 1988, is studied in detail, based on the premise that the whole earthquake sequence during the stage of anomalous seismicity before a strong event may be treated as a non-homogeneous Poisson process. The results demonstrate that: (1) from April 1985 to April 1988 there was an obvious difference in the spatial distribution of seismicity across the region; to the north of Lancang, two seismically quiescent belts appeared, one 210 km long for M≥3.5 events and another 160 km long for M≥3.0 events, so the area may be divided into four sub-regions from south to north: the south region, the mid-region, the mid'-region, and the north region; (2) before the mainshocks, there was anomalous seismic quiescence lasting as long as 42 months in the mid-region (M≥3.5) and 32.5 months in the mid'-region (M≥3.0).
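Under the Poisson framework used here, how unusual a quiescent interval is can be gauged by the probability of observing zero events over that interval at the background rate. The rate used below is hypothetical, since the abstract does not give one.

```python
from scipy import stats

# Hypothetical background rate of M>=3.5 events in the mid-region before the quiescence.
rate_per_month = 0.3
gap_months = 42.0

expected = rate_per_month * gap_months
p_zero = stats.poisson.pmf(0, expected)       # probability of a totally quiet gap by chance
print(f"expected events in {gap_months:.0f} months: {expected:.1f}, P(0 events) = {p_zero:.2e}")
```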