  Subscription full text   305 articles
  Free access   12 articles
  Free domestic access   21 articles
Surveying and Mapping   6 articles
Atmospheric Sciences   30 articles
Geophysics   85 articles
Geology   124 articles
Oceanography   20 articles
Astronomy   33 articles
Physical Geography   40 articles
  2024   3 articles
  2022   4 articles
  2021   12 articles
  2020   16 articles
  2019   15 articles
  2018   13 articles
  2017   7 articles
  2016   11 articles
  2015   17 articles
  2014   10 articles
  2013   25 articles
  2012   19 articles
  2011   19 articles
  2010   15 articles
  2009   16 articles
  2008   21 articles
  2007   19 articles
  2006   12 articles
  2005   11 articles
  2004   11 articles
  2003   7 articles
  2002   12 articles
  2001   5 articles
  2000   6 articles
  1999   3 articles
  1998   5 articles
  1997   2 articles
  1996   5 articles
  1995   2 articles
  1994   2 articles
  1992   3 articles
  1991   1 article
  1990   2 articles
  1989   1 article
  1981   1 article
  1980   1 article
  1979   2 articles
  1973   1 article
  1972   1 article
Sort by:   338 results in total; search time 15 ms
61.
The HadISST1 sea surface temperature data set is examined for two contrasting areas: the Chagos Archipelago, central Indian Ocean, which has a small (approximately 3 degrees C) annual temperature fluctuation, and Abu Dhabi in the southern Arabian Gulf, whose annual air temperature fluctuation of approximately 24 degrees C is the largest known for coral reef habitats. The HadISST1 data are shown to match air temperature records closely, both in terms of annual moving averages and residual analysis. Temperatures in 1998 caused massive mortality of corals in the Indian Ocean: the sea surface temperature (SST) causing this was 33.8 degrees C in the Arabian Gulf, at a time when average daily air temperature was over 40 degrees C, while in Chagos the SST lethal to corals was 29.8-29.9 degrees C, when air temperatures peaked at about 31 degrees C. The HadISST1 record was searched back to 1870 for previous abnormal peaks: one of 29.7 degrees C was found for Chagos SST in 1972, though this did not cause coral mortality. Analysis of 12-month running means of the residuals from the annual cycle shows that, between 1870 and 1999, the largest SST deviations occurred between October 1997 and May 1998 in Chagos and between August 1998 and July 1999 near Abu Dhabi. The event of 1998-1999 was the largest in these regions for at least 130 years. SSTs have risen over the last three decades at rates of about 0.22-0.23 degrees C per decade in both locations.
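The residual analysis described above — removing the mean annual cycle and then smoothing the residuals with a 12-month running mean — is straightforward to reproduce. Below is a minimal sketch assuming a monthly SST series held in a pandas Series with a datetime index; the function and variable names are illustrative and not taken from the paper.

```python
import pandas as pd

def running_mean_residuals(sst: pd.Series, window: int = 12) -> pd.Series:
    """Remove the mean annual cycle from a monthly SST series and
    return a centred 12-month running mean of the residuals."""
    # Mean annual cycle: long-term average for each calendar month
    climatology = sst.groupby(sst.index.month).transform("mean")
    residuals = sst - climatology
    # Centred running mean of the residuals (the anomaly series)
    return residuals.rolling(window, center=True).mean()

# Hypothetical usage:
# sst = pd.Series(values, index=pd.date_range("1870-01", periods=len(values), freq="MS"))
# anomalies = running_mean_residuals(sst)
# anomalies.idxmax() locates the month of the largest positive deviation.
```

The month of the largest positive anomaly then identifies events such as the 1997-1998 peak discussed above.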
62.
The Northern Till is a thick (>65 m) deformation till underlying some 7500 km2 of Southern Ontario, Canada, including the Peterborough Drumlin Field. It was deposited below the Lake Ontario ice stream of the Laurentide Ice Sheet. The till rests on glaciotectonized aquifer sediments and consists of multiple beds of till up to 6 m thick. These are separated by boulder lags, sometimes in the form of striated pavements, with thin (<30 cm) interbeds of poorly sorted waterlaid sand. The composite till stratigraphy indicates ‘punctuated aggradation’, whereby the subglacial bed was built up incrementally through the repeated ‘immobilization’ of deforming, overpressured till layers. Boulders and sands indicate pauses in subglacial aggradation marked by sluggish sheet flows of water that reworked the top of the underlying till. Interbeds are laterally extensive and are correlated using downhole electrical conductivity, core recovery and natural gamma data. A 3-D finite element model (FEFLOW), built with data from 200 cored and geophysically logged boreholes and a large digital water well dataset of 3400 individual records, shows that the till functions as a ‘leaky aquitard’ as a consequence of water flow through interbeds. It is proposed that interbeds played a similar role in the subglacial hydraulic system below the Laurentide Ice Sheet by allowing drainage of excess porewater pressures in deforming sediment and promoting deposition of till. This is in agreement with theoretical studies of deforming bed dynamics and observations at modern glaciers, where porewater in the deforming layer is discharged into underlying aquifers. In this way, the presence of interbeds may be fundamental in retarding downglacier transport of deforming bed material, thereby promoting the build-up of thick subglacial till successions.
63.
Seasonal forecasts for Yangtze River basin rainfall in June, May–June–July (MJJ), and June–July–August (JJA) 2020 are presented, based on the Met Office GloSea5 system. The three-month forecasts are based on dynamical predictions of an East Asian Summer Monsoon (EASM) index, which is transformed into regional-mean rainfall through linear regression. The June rainfall forecasts for the middle/lower Yangtze River basin are based on linear regression of precipitation. The forecasts verify well in terms of giving strong, consistent predictions of above-average rainfall at lead times of at least three months. However, the Yangtze region was subject to exceptionally heavy rainfall throughout the summer period, leading to observed values that lie outside the 95% prediction intervals of the three-month forecasts. The forecasts presented here are consistent with other studies of the 2020 EASM rainfall, whereby the enhanced mei-yu front in early summer is skillfully forecast, but the impact of midlatitude drivers enhancing the rainfall in later summer is not captured. This case study demonstrates both the utility of probabilistic seasonal forecasts for the Yangtze region and the potential limitations in anticipating complex extreme events driven by a combination of coincident factors.
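As a rough illustration of the regression step described above — mapping a predicted East Asian Summer Monsoon index onto regional-mean rainfall and attaching a 95% prediction interval — the sketch below fits an ordinary least-squares line to historical index/rainfall pairs. It is a generic sketch under those assumptions, not the GloSea5 calibration code, and all names are illustrative.

```python
import numpy as np
from scipy import stats

def rainfall_from_index(index_hist, rain_hist, index_forecast, level=0.95):
    """Transform a predicted monsoon index into regional-mean rainfall via
    linear regression on historical pairs, returning the central forecast
    and a prediction interval at the requested level."""
    index_hist = np.asarray(index_hist, dtype=float)
    rain_hist = np.asarray(rain_hist, dtype=float)
    n = len(index_hist)
    slope, intercept, r, p, se = stats.linregress(index_hist, rain_hist)
    pred = intercept + slope * index_forecast
    # Residual standard error and prediction-interval half-width
    resid = rain_hist - (intercept + slope * index_hist)
    s = np.sqrt(np.sum(resid**2) / (n - 2))
    x_mean = index_hist.mean()
    sxx = np.sum((index_hist - x_mean) ** 2)
    half = stats.t.ppf(0.5 + level / 2, n - 2) * s * np.sqrt(
        1 + 1 / n + (index_forecast - x_mean) ** 2 / sxx
    )
    return pred, (pred - half, pred + half)
```

An observation falling outside the returned interval, as happened in summer 2020, is exactly the situation the abstract describes.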
64.
65.
The analysis of geographically referenced data, specifically point data, is predicated on the accurate geocoding of those data. Geocoding refers to the process in which geographically referenced data (addresses, for example) are placed on a map. This process may lead to issues with positional accuracy or the inability to geocode an address. In this paper, we conduct an international investigation into the impact of the (in)ability to geocode an address on the resulting spatial pattern. We use a variety of point data sets of crime events (varying numbers of events and types of crime), a variety of areal units of analysis (varying the number and size of areal units) from a variety of countries (varying underlying administrative systems), and a locally based spatial point pattern test to determine the geocoding match rates required to maintain the spatial patterns of the original data when addresses are missing at random. We find that the level of geocoding success required depends on the number of points and the number of areal units under analysis, but in general the necessary match rates are lower than those reported in previous research. This finding is consistent across different national contexts.
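The missing-at-random experiment can be mimicked with a simple simulation: drop a random share of points to emulate a given geocoding match rate, recount events per areal unit, and compare the result with the full data. The sketch below uses a Spearman rank correlation of areal-unit counts as a stand-in for the locally based spatial point pattern test used in the paper; the function name, inputs and the use of rank correlation are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def match_rate_similarity(unit_ids, match_rate, n_sim=100, seed=0):
    """Simulate geocoding failures that are missing at random and measure
    how similar the per-areal-unit event counts remain to the full data
    (mean Spearman correlation over n_sim simulations)."""
    rng = np.random.default_rng(seed)
    unit_ids = pd.Series(unit_ids)          # areal-unit label of each geocoded event
    full = unit_ids.value_counts()          # counts with every event geocoded
    sims = []
    for _ in range(n_sim):
        keep = rng.random(len(unit_ids)) < match_rate
        sample = unit_ids[keep].value_counts().reindex(full.index, fill_value=0)
        sims.append(full.corr(sample, method="spearman"))
    return float(np.mean(sims))
```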
66.
The use of data‐driven modelling techniques to deliver improved suspended sediment rating curves has received considerable interest in recent years. Studies indicate an increased level of performance over traditional approaches when such techniques are adopted. However, closer scrutiny reveals that, unlike their traditional counterparts, data‐driven solutions commonly include lagged sediment data as model inputs, and this seriously limits their operational application. In this paper, we argue the need for a greater degree of operational reasoning underpinning data‐driven rating curve solutions and demonstrate how incorrect conclusions about the performance of a data‐driven modelling technique can be reached when the model solution is based upon operationally invalid input combinations. We exemplify the problem through the re‐analysis and augmentation of a recent and typical published study, which uses gene expression programming to model the rating curve. We compare and contrast the previously published solutions, whose inputs negate their operational application, with a range of newly developed and directly comparable traditional and data‐driven solutions, which do have operational value. Results clearly demonstrate that the performance benefits of the published gene expression programming solutions are dependent on the inclusion of operationally limiting, lagged data inputs. Indeed, when operationally inapplicable input combinations are discounted from the models and the analysis is repeated, gene expression programming fails to perform as well as many simpler, more standard multiple linear regression, piecewise linear regression and neural network counterparts. The potential for overstatement of the benefits of the data‐driven paradigm in rating curve studies is thus highlighted. Copyright © 2011 John Wiley & Sons, Ltd.
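To make the operational point concrete: a conventional power-law rating curve needs only discharge as an input, so it can always be run in real time, whereas a data-driven model that also requires the previous time step's sediment concentration cannot, because that lagged value is precisely what is unmeasured. The sketch below shows the conventional form; it is a generic illustration, not code or coefficients from the study.

```python
import numpy as np

def rating_curve_fit(q, c):
    """Fit a conventional power-law rating curve C = a * Q^b by least
    squares in log space. Only discharge Q is needed to apply it, so the
    fitted curve remains operationally usable."""
    b, log_a = np.polyfit(np.log(q), np.log(c), 1)
    return np.exp(log_a), b

def rating_curve_predict(q, a, b):
    """Predict suspended sediment concentration from discharge alone."""
    return a * np.asarray(q, dtype=float) ** b

# A model that additionally takes c[t-1] as an input may score better in
# hindcast tests, but it cannot be run operationally because the lagged
# sediment concentration is exactly the quantity that is not measured.
```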
67.
68.
Antarctic digital elevation models (DEMs) are fundamental data for studying polar atmospheric circulation models, monitoring the dynamics of the Antarctic ice sheet, and supporting Antarctic scientific expeditions. Five different Antarctic digital surface elevation models have so far been released, all derived from satellite radar altimetry, laser altimetry and a limited amount of ground-based measurement. Nevertheless, because the ice sheet margin, where the ice meets the ocean, changes rapidly over time, the Antarctic surface elevation data need to be updated promptly from new satellite observations. We therefore produced an updated Antarctic ice sheet elevation dataset from radar altimetry (Envisat RA-2) and laser altimetry (ICESat/GLAS) data. To improve the accuracy of the ICESat/GLAS data, five quality-control criteria were applied, removing 8.36% of unqualified records; these criteria address satellite positioning errors, atmospheric forward scattering, saturation and cloud effects. The Envisat RA-2 data were corrected for the dry and wet troposphere, the ionosphere, solid Earth tides and the pole tide. To combine the two altimetry datasets, a relative elevation correction method based on the geometric intersection of Envisat RA-2 and GLAS footprints is proposed: pairs in which a GLAS footprint overlaps the centre of an Envisat RA-2 footprint are identified, the relationship between their elevation differences (GLAS minus RA-2) and surface roughness, a measure of terrain relief, is established, and the Envisat RA-2 data are relatively corrected using the pairs that show a stable relationship. Based on the density of altimetry points in different regions of the ice sheet, the resolution of the final DEM was set to 1000 m. Considering the differences between the Prydz Bay region and the interior, the ice sheet was divided into 16 zones; semivariogram analysis was used to determine the optimal interpolation model and parameters for each zone, and kriging was applied to generate the 1000 m resolution Antarctic ice sheet elevation dataset. The new DEM was validated against two airborne lidar datasets and GPS measurements collected during several Chinese Antarctic expeditions. The differences between the new DEM and the in situ data range from 3.21 m to 27.84 m, and the error distribution is closely related to surface slope. Compared with previously released Antarctic DEMs, the new DEM shows considerably improved accuracy in areas of steep slope and in the rapidly changing ice sheet margins.
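The relative correction step — regressing the GLAS-minus-RA-2 elevation difference at intersecting footprints against surface roughness and applying the fitted bias where the relationship is stable — can be sketched as follows. The linear form and the correlation threshold are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def relative_correction(dh, roughness, ra2_elev, ra2_roughness, min_r=0.7):
    """Relative correction of radar-altimeter elevations at crossovers:
    fit dh = GLAS - RA-2 against surface roughness and, if the relationship
    is stable (|r| above a threshold), add the predicted bias to the RA-2
    elevations so they are consistent with GLAS."""
    slope, intercept = np.polyfit(roughness, dh, 1)
    r = np.corrcoef(roughness, dh)[0, 1]
    if abs(r) < min_r:
        # No stable relationship: leave the RA-2 elevations unchanged
        return np.asarray(ra2_elev, dtype=float)
    bias = intercept + slope * np.asarray(ra2_roughness, dtype=float)
    return np.asarray(ra2_elev, dtype=float) + bias
```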
69.
There is no lack of significant open questions in the field of hydrology. How will hydrological connectivity between freshwater bodies be altered by future human alterations to the hydrological cycle? Where does water go when it rains? Or what is the future space–time variability of flood and drought events? However, the answers to these questions will vary with location due to the specific and often poorly understood local boundary conditions and system properties that control the functional behaviour of a catchment or any other hydrologic control volume. We suggest that an open, shared and evolving perceptual model of a region's hydrology is critical to tailoring our science questions, as it would be for any other study domain from the plot to the continental scale. In this opinion piece, we begin to discuss the elements of, and point out some knowledge gaps in, the perceptual model of the terrestrial water cycle of Great Britain. We discuss six major knowledge gaps and propose four key ways to reduce them. While the specific knowledge gaps in our perceptual model do not necessarily transfer to other places, we believe that the development of such perceptual models should be at the core of the debate for all hydrologic communities, and we encourage others to have a similar debate for their hydrologic domain.
70.
Forward calculations of magnetic anomalies caused by two-dimensional bodies of any shape and magnetic properties may be performed either without considering demagnetization, as in the equivalent source technique, or by taking demagnetization into account, as in the volume integral equation (VIE) approach, in which, for this purpose, magnetized bodies are divided into a set of rectangular prismatic cells. Ignoring demagnetization may result in distortion of the shape and amplitude of an anomaly, whereas rectangular cells may not be an optimal representation of the source. Moreover, an inaccurate approximation of the body shape in the VIE technique may lead to inconsistent results in the near-body region. In this paper, a method is proposed that is based on the VIE approach but differs in its use of triangular elementary cells. The method largely overcomes the above-mentioned limitations of the VIE technique. It allows large and complex structures to be delineated exactly and requires the source to be divided into only a few elementary cells to take demagnetization into account satisfactorily. These improvements have been attained through analytical calculation of the Green's function in the complex plane, using the theory of the Cauchy-type integral. Comparing numerical solutions with analytical solutions for homogeneous elliptic cylinders without remanence, the method is found to be consistent with theory over the relative magnetic permeability range 2–20, not only far from but also at subcell distances from the body. The method is appropriate for modelling highly and inhomogeneously magnetized 2D bodies of any shape. It may be of value in interpreting underground measurements or topographic effects, as well as in modelling regional geomagnetic profiles, and it is also a convenient tool for testing questionable geological hypotheses. Within the framework of the method, the gravitational anomaly of the same causative bodies can easily be calculated. However, at higher, geologically uncommon values of relative magnetic permeability the algorithm may become unstable, although it can be stabilized with SVD regularization. The discrepancies found with the method employed provide a basis for further research.
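Schematically, once the body is divided into cells, the cell magnetisations satisfy a linear system of the form (I - chi*G) m = chi*h0, where G collects the cell-to-cell demagnetising-field coefficients and chi is the susceptibility. The truncated-SVD solve below illustrates the kind of regularisation the abstract suggests for high, geologically uncommon permeabilities; it is a generic sketch, not the authors' triangular-cell formulation, and all names are assumptions.

```python
import numpy as np

def solve_vie_tsvd(G, h0, chi, rel_tol=1e-8):
    """Solve the discretised volume integral equation (I - chi*G) m = chi*h0
    for the cell magnetisations m, where G is the matrix of cell-to-cell
    demagnetising-field coefficients and h0 the inducing field at each cell.
    A truncated SVD keeps the solve stable when the system becomes
    ill-conditioned at high relative permeability."""
    G = np.asarray(G, dtype=float)
    h0 = np.asarray(h0, dtype=float)
    n = len(h0)
    A = np.eye(n) - chi * G
    U, s, Vt = np.linalg.svd(A)
    # Discard singular values below the relative tolerance (the regularisation step)
    s_inv = np.where(s > rel_tol * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ (chi * h0)))
```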