191.
Abstract

Shoreline extraction is fundamental to several studies, and ascertaining the precise spatial location of the shoreline is crucial. Recently, the need for remote sensing data to accomplish the complex task of automatically extracting features such as shorelines has increased considerably. Automated feature extraction can drastically reduce the time and cost of data acquisition and database updating, and effective, fast approaches are essential for monitoring coastline retreat and updating shoreline maps. Here, we present a flexible mathematical-morphology-driven algorithm for shoreline extraction from satellite imagery. The salient features of this work are preservation of the actual size and shape of the shorelines, run-time structuring-element definition, semi-automation, fast processing, and single-band adaptability. The proposed approach is tested on images from various sensors at low to high resolutions. The accuracy of the methodology has been assessed against manually prepared ground truth of the study area and compared with an existing shoreline classification approach. Based on visual and quantitative assessments, the proposed approach successfully extracts shorelines from a wide variety of satellite images.
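The morphological core of such an extraction can be sketched as a mask-minus-erosion (morphological gradient) step. The sketch below is a generic NumPy illustration, not the authors' implementation; the fixed 3x3 structuring element stands in for the run-time-defined one described in the abstract.

```python
import numpy as np

def extract_shoreline(water_mask: np.ndarray) -> np.ndarray:
    """Shoreline pixels = water mask minus its binary erosion
    (morphological gradient) with a 3x3 structuring element."""
    m = np.pad(water_mask.astype(bool), 1, constant_values=False)
    h, w = water_mask.shape
    eroded = np.ones((h, w), dtype=bool)
    # erosion: a pixel survives only if its whole 3x3 neighbourhood is water
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            eroded &= m[di:di + h, dj:dj + w]
    return water_mask.astype(bool) & ~eroded

# toy scene: a 6x6 "water body" inside a 10x10 grid
mask = np.zeros((10, 10), dtype=bool)
mask[2:8, 2:8] = True
shoreline = extract_shoreline(mask)
```

On the toy scene the result is the one-pixel-wide boundary of the water body, which preserves its original location and shape rather than shifting it, in the spirit of the size/shape-preservation claim above.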
192.
This study illustrates how remotely sensed oceanic variables and fishing-operations data can be used to predict suitable habitat of fishery resources in a Geographic Information System (GIS). We used sea surface height anomaly (SSHa), sea surface temperature (SST), chlorophyll concentration (CC), photosynthetically active radiation (PAR) and fishing depth as predictor variables. Fishery data for Indian squid (Loligo spp.) and catfish (Tachysurus spp.) over the study period (1998–2004) were randomly segregated into training and validation sets. Catch was normalized to catch per unit effort (kg h⁻¹). Generalized additive models were fitted to the training data and then tested on the validation data. Suitable ranges of SST, CC, SSHa and PAR for the distribution of each species were derived and integrated to predict their spatial distributions. The results indicated a good match between predicted and actual catch, and monthly probability maps of predicted habitat areas coincided with high catches in the corresponding months of the study period.
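The final integration step, overlaying per-variable suitable ranges to map habitat, can be sketched as below. The ranges are invented placeholders, not the values derived in the study, and the real work fitted generalized additive models rather than the hard thresholds used here for illustration.

```python
import numpy as np

# Illustrative placeholder "suitable ranges" per predictor
# (the study derives these from GAM response curves).
SUITABLE = {"sst": (26.0, 29.0), "cc": (0.2, 1.5), "ssha": (-5.0, 10.0)}

def habitat_map(grids: dict) -> np.ndarray:
    """1 where every predictor falls inside its suitable range, else 0."""
    ok = np.ones_like(next(iter(grids.values())), dtype=bool)
    for name, grid in grids.items():
        lo, hi = SUITABLE[name]
        ok &= (grid >= lo) & (grid <= hi)
    return ok.astype(int)

# tiny 2x2 example grids of SST (degC), CC (mg m-3) and SSHa (cm)
sst = np.array([[27.0, 31.0], [28.0, 26.5]])
cc = np.array([[0.5, 0.5], [2.0, 1.0]])
ssha = np.array([[0.0, 0.0], [0.0, 20.0]])
pred = habitat_map({"sst": sst, "cc": cc, "ssha": ssha})
```

Only the cell where all three predictors lie inside their ranges is flagged suitable; each other cell is excluded by a single out-of-range variable.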
193.
This study describes the development and application of an algorithm to compute Köppen‐Geiger climate classifications from Coupled Model Intercomparison Project (CMIP) and Paleoclimate Modelling Intercomparison Project (PMIP) climate model simulation data. The classification algorithm was applied to data from the PMIP III paleoclimate experiments for the Last Glacial Maximum (21k years before present, yBP), Mid‐Holocene (6k yBP) and Pre‐Industrial (0k yBP, control run) time slices. To infer detailed classification maps, the simulation datasets were interpolated to a higher resolution. The classification method is based on open-source software, and its implementation is described in detail. The source code, the exact input data sets and the resulting data sets are provided to enable the application of the presented approach.
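A heavily simplified first-letter Köppen rule from monthly mean temperatures can be sketched as follows. This is a toy illustration of the rule structure only: the arid (B) class, which also requires precipitation data, and all second and third letters are omitted, so it is not the algorithm published with the study.

```python
def koppen_first_letter(monthly_temp_c: list) -> str:
    """Coarse first-letter Koppen rule from 12 monthly mean temperatures
    (degC). Omits the arid (B) class; uses the -3 degC variant of the
    temperate/continental boundary."""
    warmest = max(monthly_temp_c)
    coldest = min(monthly_temp_c)
    if warmest < 10.0:
        return "E"   # polar: no month reaches 10 degC
    if coldest >= 18.0:
        return "A"   # tropical: every month at least 18 degC
    if coldest > -3.0:
        return "C"   # temperate
    return "D"       # continental

tropical = [25.0] * 12
polar = [-20.0] * 6 + [5.0] * 6
temperate = [2.0, 4.0, 8.0, 12.0, 16.0, 20.0, 22.0, 21.0, 17.0, 12.0, 7.0, 3.0]
```

Applied per grid cell of an interpolated model field, a full version of such a rule set yields the classification maps described above.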
194.
195.
The coastal zone assumes importance due to the high productivity of its ecosystems, man-made developmental activities, natural hazards and the dynamic nature of the coast. As coastal ecosystems are unique and fragile, understanding the impact of developmental activities on the sustainability of the coastal zone is very important. Remote sensing, because of its repetitive and synoptic nature, is an ideal tool for studying this. Time-series data analyses for monitoring the coastal zone require different types of sensors. The present study deals with atmospheric correction of satellite data, reflectance, selection of coastal features such as mudflats, mangroves, vegetated dunes and coastal water, and their inter-comparison using data from different RESOURCESAT sensors. Reflectance values give better separability of the various coastal features than DN values. LISS IV can be used in place of LISS III or merged (LISS III + PAN) data for long-term coastal zone studies.
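The DN-to-reflectance step referred to above follows the standard top-of-atmosphere conversion rho = pi * L * d^2 / (ESUN * cos(theta_s)). The sketch below uses generic placeholder calibration constants, not RESOURCESAT's published gain, bias and ESUN values.

```python
import math

def toa_reflectance(dn, gain, bias, esun, sun_elev_deg, d_au=1.0):
    """DN -> at-sensor radiance -> top-of-atmosphere reflectance.
    gain/bias: linear radiometric calibration (W m-2 sr-1 um-1 per DN);
    esun: mean solar exoatmospheric irradiance; d_au: Earth-Sun distance."""
    radiance = gain * dn + bias
    theta_s = math.radians(90.0 - sun_elev_deg)  # solar zenith angle
    return math.pi * radiance * d_au**2 / (esun * math.cos(theta_s))

# placeholder calibration values for illustration only
rho = toa_reflectance(dn=120, gain=0.5, bias=1.0,
                      esun=1850.0, sun_elev_deg=60.0)
```

Because the cosine term normalizes for illumination geometry, reflectances from different acquisition dates and sensors become comparable, which is what makes the inter-sensor feature comparison above feasible.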
196.
Accurate upward continuation of gravity anomalies supports future precision free-inertial navigation systems, since such systems cannot by themselves sense the gravitational field and thus require appropriate gravity compensation in the form of horizontal gravity components. An analysis of the model errors in upward continuation using derivatives of the standard Pizzetti integral solution (spherical approximation) shows that discretization of the data and truncation of the integral are the major sources of error in the predicted horizontal components of the gravity disturbance. The irregular shape of the data boundary, even for the relatively rough topography of a simulated mountainous region, has only a secondary effect, except when the data resolution is very high (small discretization error). Other errors due to the spherical approximation are even less important. The analysis excluded all measurement errors in the gravity anomaly data in order to quantify the model errors alone. Based on a consistent gravity-field/topographic-surface simulation, upward-continuation errors in the derivatives of the Pizzetti integral to mean altitudes of about 3,000 and 1,500 m above the mean surface ranged from less than 1 mGal to less than 2 mGal (standard deviation), respectively, for 2-arcmin data resolution. Least-squares collocation performs better than this, but may require significantly greater computational resources.
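In a flat-earth setting, upward continuation is a spectral low-pass with attenuation factor exp(-|k| h); the Pizzetti-based spherical integral analyzed above is the rigorous counterpart of this. The FFT sketch below is a generic planar illustration under that simplifying assumption, not the paper's method.

```python
import numpy as np

def upward_continue(grid: np.ndarray, dx: float, h: float) -> np.ndarray:
    """Planar upward continuation of a gridded field by height h:
    multiply the 2-D spectrum by exp(-|k| h) and transform back."""
    ny, nx = grid.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    spec = np.fft.fft2(grid) * np.exp(-k * h)
    return np.real(np.fft.ifft2(spec))

# a constant anomaly field (k = 0 only) is unchanged by upward continuation
flat = np.full((32, 32), 5.0)
up = upward_continue(flat, dx=1000.0, h=3000.0)
```

Short-wavelength content decays fastest with altitude, which is why discretization (loss of high-frequency data) dominates the error budget at high data resolution, as noted above.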
197.
Time series of zenith wet and total troposphere delays, as well as north and east gradients, are compared, and zenith total delays (ZTD) are combined at the level of parameter estimates. Input data sets are provided by ten Analysis Centers (ACs) of the International VLBI Service for Geodesy and Astrometry (IVS) for the CONT08 campaign (12–26 August 2008). The inconsistent usage of meteorological data and models, such as mapping functions, causes systematics among the ACs, and differing parameterizations and constraints add noise to the troposphere parameter estimates. The empirical standard deviation of ZTD among the ACs with regard to an unweighted mean is 4.6 mm. The ratio of the analysis noise to the observation noise assessed by the operator/software impact (OSI) model is about 2.5. These and other effects have to be accounted for to improve the intra-technique combination of VLBI-derived troposphere parameters. While the largest systematics, caused by inconsistent usage of meteorological data, can be avoided, and the application of different mapping functions can be accounted for by empirical corrections, the noise has to be modeled in the stochastic model of the intra-technique combination. The application of different stochastic models shows no significant effect on the combined parameters but results in different mean formal errors: the mean formal errors of the combined ZTD are 2.3 mm (unweighted), 4.4 mm (diagonal), 8.6 mm (variance component, VC, estimation), and 8.6 mm (operator/software impact, OSI). On the one hand, the OSI model, i.e. the inclusion of off-diagonal elements in the cofactor matrix, accounts for the reuse of observations, yielding mean formal errors about a factor of two larger than the diagonal approach. On the other hand, the combination based on VC estimation shows large differences among the VCs yet exhibits a comparable scaling of the formal errors. Thus, for the combination of troposphere parameters, combining the two extensions of the stochastic model is recommended.
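The two simplest stochastic models contrasted above, an unweighted mean versus inverse-variance ("diagonal") weighting, can be sketched for a single epoch as follows. The ZTD values and formal errors are invented for illustration and do not come from the CONT08 analysis.

```python
import numpy as np

def combine_ztd(estimates: np.ndarray, sigmas: np.ndarray):
    """Combine per-AC ZTD estimates at one epoch: unweighted mean with its
    empirical formal error vs. inverse-variance weighted mean (diagonal
    stochastic model) with its propagated formal error."""
    unweighted = estimates.mean()
    sigma_unw = estimates.std(ddof=1) / np.sqrt(len(estimates))
    w = 1.0 / sigmas**2
    weighted = np.sum(w * estimates) / np.sum(w)
    sigma_w = 1.0 / np.sqrt(np.sum(w))
    return unweighted, sigma_unw, weighted, sigma_w

# invented ZTD estimates (mm) from four hypothetical ACs
ztd = np.array([2400.0, 2404.0, 2398.0, 2402.0])
sig = np.array([2.0, 4.0, 2.0, 4.0])
uw, s_uw, wm, s_wm = combine_ztd(ztd, sig)
```

Neither variant models correlations between ACs; the OSI and VC extensions discussed above add exactly that off-diagonal structure, which is why they inflate the formal errors relative to the diagonal case.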
198.
Tomographic 4D reconstructions of ionospheric anomalies appearing in the high-latitude polar cap region are compared with plasma density measurements by a digital ionosonde located near the north magnetic pole at Eureka station and with in situ plasma measurements on board DMSP spacecraft. The moderate magnetic storm of 14–17 October 2002 is taken as an example of a geomagnetic disturbance that generates large-scale ionospheric plasma anomalies at mid-latitudes and in the polar cap region. Comparison of the GPS tomographic reconstructions over Eureka station with the ionosonde measurements of the F-layer peak densities indicates that GPS tomography correctly predicts the time of arrival and passage of the ionospheric tongue of ionization over the magnetic pole area, although the tomographic technique appears to underestimate the F-peak plasma density. Comparison with the in situ plasma measurements by the DMSP SSIES instruments shows that GPS tomography correctly reproduces the large-scale spatial structure of ionospheric anomalies over a wide range of latitudes, from mid-latitudes to the high-latitude polar cap region, though the tomographic reconstructions tend to overestimate the density of the topside ionosphere at the 840-km DMSP orbit. This study is essential for understanding the quality and limitations of tomographic reconstruction techniques, particularly in high-latitude regions where GPS TEC measurements and other ionospheric data sources are limited.
199.
A realistic assessment of the total uncertainty budget of Global Positioning System (GPS) observations and its adequate mathematical treatment is a basic requirement for all analysis and interpretation of GPS-derived point positions, in particular GPS heights, and their respective changes. This implies not only the random variability but also the remaining systematic errors. At present in geodesy, the main focus is on stochastic approaches in which errors are modeled by means of random variables. Here, an alternative approach based on interval mathematics is presented. It allows us to model and to quantify the impact of remaining systematic errors in GPS carrier-phase observations on the final results using deterministic error bands. In this paper, emphasis is given to the derivation of the observation intervals based on influence parameters and to the study of the complex linear transfer of this type of uncertainty to estimated point positions yielding zonotopes. From the presented simulation studies of GPS baselines, it turns out that the uncertainty due to remaining systematic effects dominates the total uncertainty budget for baselines longer than 200 km.
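The deterministic transfer of observation intervals through a linear model can be sketched with midpoint-radius arithmetic: for y = A x with x in a box of midpoint mid and componentwise radius rad, the interval hull (bounding box) of the resulting zonotope is mid_y = A mid, rad_y = |A| rad. The toy numbers below are illustrative only and unrelated to the GPS simulation in the paper.

```python
import numpy as np

def propagate_intervals(A: np.ndarray, mid: np.ndarray, rad: np.ndarray):
    """Interval hull of the zonotope {A x : |x - mid| <= rad componentwise}:
    lower and upper bounds per output component."""
    mid_y = A @ mid
    rad_y = np.abs(A) @ rad   # radii combine through |A| (worst case)
    return mid_y - rad_y, mid_y + rad_y

# toy 2x2 linear model and observation intervals
A = np.array([[1.0, -2.0], [0.5, 1.0]])
lo, hi = propagate_intervals(A, mid=np.array([10.0, 3.0]),
                             rad=np.array([0.1, 0.2]))
```

Unlike variance propagation, the radii add in absolute value regardless of sign cancellation, which is what makes the resulting error bands deterministic worst-case bounds rather than standard deviations.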
200.
A four-component decomposition scheme of the coherency matrix is presented for the analysis of polarimetric synthetic aperture radar (SAR) images. The coherency matrix is used to deal with the nonreflection-symmetric scattering case, extending the covariance-matrix approach, and the same decomposition results are obtained. The advantage of this approach is that the four scattering powers are expressed explicitly in terms of the scattering matrix elements, which serves the quantitative interpretation of polarimetric SAR data.
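The coherency matrix on which such a decomposition operates is built from the Pauli scattering vector. The sketch below shows only this construction and the span (total power) invariant trace(T) = |S_hh|^2 + 2|S_hv|^2 + |S_vv|^2, not the four-component power expressions themselves.

```python
import numpy as np

def coherency_matrix(s_hh: complex, s_hv: complex, s_vv: complex) -> np.ndarray:
    """Single-look coherency matrix T = k k^H from the Pauli vector
    k = (1/sqrt(2)) [S_hh + S_vv, S_hh - S_vv, 2 S_hv]^T."""
    k = np.array([s_hh + s_vv, s_hh - s_vv, 2.0 * s_hv]) / np.sqrt(2.0)
    return np.outer(k, np.conj(k))

# arbitrary example scattering-matrix elements (monostatic, S_hv = S_vh)
T = coherency_matrix(1.0 + 1.0j, 0.2 - 0.1j, 0.5 + 0.0j)
span = np.real(np.trace(T))
```

In practice T is averaged over a multi-look window before decomposition; the decomposition then splits the span into the four scattering powers whose closed-form expressions are the paper's contribution.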
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号