191.
This study illustrates how remotely sensed oceanic variables and fishing-operations data can be used to predict suitable habitat of fishery resources in a Geographic Information System. We used sea surface height anomaly (SSHa), sea surface temperature (SST), chlorophyll concentration (CC), photosynthetically active radiation (PAR) and fishing depth as predictor variables. Fishery data for Indian squid (Loligo spp.) and catfish (Tachysurus spp.) for the study period (1998–2004) were randomly segregated into training and validation sets. Catch was normalized to catch per unit effort (kg h−1). Generalized additive models were fitted to the training data and then tested on the validation data. Suitable ranges of SST, CC, SSHa and PAR for the distribution of each species were derived and integrated to predict its spatial distribution. The results indicated a good match between predicted and actual catch: monthly probability maps of predicted habitat coincided with the high-catch areas of the corresponding month throughout the study period.
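A minimal sketch of the suitable-range step described above, using SST only (the paper also uses SSHa, CC, PAR and fishing depth, and fits generalized additive models; the synthetic data, percentile thresholds and variable names here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training split: one row per fishing operation,
# with catch peaking near an (assumed) optimal SST of 28 deg C.
sst = rng.uniform(24.0, 32.0, 500)                                 # deg C
catch_kg = 100.0 * np.exp(-((sst - 28.0) / 1.5) ** 2) + rng.uniform(0, 5, 500)
effort_h = rng.uniform(1.0, 2.0, 500)                              # haul duration, h

# Normalize catch to catch per unit effort (kg / h).
cpue = catch_kg / effort_h

# Derive a "suitable range" of SST from the top-quartile CPUE operations.
top = sst[cpue >= np.quantile(cpue, 0.75)]
sst_lo, sst_hi = np.percentile(top, [10, 90])

# Predict habitat for a grid cell: suitable if its SST falls inside the range.
def suitable(sst_cell: float) -> bool:
    return sst_lo <= sst_cell <= sst_hi

print(round(sst_lo, 2), round(sst_hi, 2), suitable(28.0))
```

In the paper, ranges derived this way for several variables are intersected on a grid to produce the monthly habitat probability maps.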
192.
This study describes the development and application of an algorithm that computes Köppen-Geiger climate classifications from Coupled Model Intercomparison Project (CMIP) and Paleoclimate Modelling Intercomparison Project (PMIP) climate model simulation data. The algorithm was applied to data from the PMIP III paleoclimate experiments for the Last Glacial Maximum (21k years before present, yBP), the Mid-Holocene (6k yBP) and the Pre-Industrial control run (0k yBP) time slices. To obtain detailed classification maps, the simulation datasets were interpolated to a higher resolution. The classification method is built entirely on open-source software, and its implementation is described in detail. The source code, the exact input data sets and the resulting data sets are provided so that the presented approach can be reproduced.
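The core of such an algorithm is a set of threshold rules on monthly temperature and precipitation. The sketch below implements only the five main classes with one common variant of the thresholds (the sub-type rules, the actual class boundaries used by the paper, and its hemisphere handling may differ):

```python
import numpy as np

def koppen_main_class(t_mon, p_mon, south_hemisphere=False):
    """Simplified Köppen-Geiger main class (A/B/C/D/E) from 12 monthly mean
    temperatures (deg C) and precipitation totals (mm). Sub-types are
    omitted; the summer half-year is fixed as April-September (north)."""
    t = np.asarray(t_mon, float)
    p = np.asarray(p_mon, float)
    t_ann, p_ann = t.mean(), p.sum()

    # Polar climates are checked first: warmest month below +10 deg C.
    if t.max() < 10.0:
        return "E"

    # Dryness threshold depends on the seasonality of precipitation.
    summer = slice(3, 9) if not south_hemisphere else np.r_[0:3, 9:12]
    p_summer = p[summer].sum()
    if p_summer >= 2.0 / 3.0 * p_ann:           # summer-rain regime
        p_th = 2.0 * t_ann + 28.0
    elif p_ann - p_summer >= 2.0 / 3.0 * p_ann:  # winter-rain regime
        p_th = 2.0 * t_ann
    else:
        p_th = 2.0 * t_ann + 14.0
    if p_ann < 10.0 * p_th:
        return "B"                               # arid

    if t.min() >= 18.0:
        return "A"                               # equatorial
    if t.min() > -3.0:
        return "C"                               # warm temperate
    return "D"                                   # snow
```

Applied cell-by-cell to the interpolated monthly climatologies, this yields the classification maps; the paper's full scheme additionally assigns precipitation and temperature sub-types.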
193.
194.
The coastal zone is important because of the high productivity of its ecosystems, man-made developmental activities, natural hazards and the dynamic nature of the coast. As coastal ecosystems are unique and fragile, understanding the impact of developmental activities on the sustainability of the coastal zone is essential. Remote sensing, with its repetitive coverage and synoptic view, is an ideal tool for such studies, and time-series analyses for monitoring the coastal zone require data from different types of sensors. The present study deals with atmospheric correction of satellite data, conversion to reflectance, selection of coastal features such as mudflats, mangroves, vegetated dunes and coastal water, and their inter-comparison across the RESOURCESAT sensors. Reflectance values separate the various coastal features better than raw DN values, and LISS IV can be used in place of LISS III or merged (LISS III + PAN) data for long-term coastal zone studies.
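The DN-to-reflectance conversion underlying such inter-comparisons follows the standard top-of-atmosphere formula. A sketch with a generic linear radiometric calibration (the gain, bias and ESUN values for the actual RESOURCESAT bands must be taken from the sensor handbook; the numbers in the test are invented):

```python
import math

def dn_to_toa_reflectance(dn, gain, bias, esun, sun_elev_deg, d_au=1.0):
    """Convert a sensor digital number (DN) to top-of-atmosphere reflectance.
    gain/bias: band calibration (radiance = gain * DN + bias, W m-2 sr-1 um-1);
    esun: band mean solar exoatmospheric irradiance (W m-2 um-1);
    d_au: Earth-Sun distance in astronomical units."""
    radiance = gain * dn + bias
    theta_s = math.radians(90.0 - sun_elev_deg)   # solar zenith angle
    return math.pi * radiance * d_au ** 2 / (esun * math.cos(theta_s))
```

Because the sun-angle and irradiance terms are removed, reflectances from different sensors (LISS III, LISS IV) become directly comparable, which is what allows the feature separability comparison described above.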
195.
Accurate upward continuation of gravity anomalies supports future precision free-inertial navigation systems, since such systems cannot by themselves sense the gravitational field and therefore require appropriate gravity compensation in the form of horizontal gravity components. An analysis of the model errors in upward continuation using derivatives of the standard Pizzetti integral solution (spherical approximation) shows that discretization of the data and truncation of the integral are the major error sources in the predicted horizontal components of the gravity disturbance. The irregular shape of the data boundary, even over the relatively rough topography of a simulated mountainous region, has only a secondary effect, except when the data resolution is very high (small discretization error); other errors due to the spherical approximation matter even less. All measurement errors in the gravity anomaly data were excluded from the analysis in order to quantify the model errors alone. Based on a consistent gravity-field/topographic-surface simulation with 2 arcmin data resolution, the upward continuation errors in the derivatives of the Pizzetti integral ranged from less than 1 mGal (standard deviation) at a mean altitude of about 3,000 m above the mean surface to less than 2 mGal (standard deviation) at about 1,500 m. Least-squares collocation performs better than this, but may require significantly greater computational resources.
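For intuition about upward continuation itself, the flat-Earth counterpart of the Pizzetti/Poisson integral can be written as a spectral filter: each wavenumber is damped by exp(−2π|k|h). This planar sketch is not the spherical formulation analyzed in the paper, only an illustration of the operation:

```python
import numpy as np

def upward_continue(grid, dx, height):
    """Upward-continue a gridded gravity anomaly (planar approximation) by
    multiplying its 2-D spectrum with exp(-2*pi*|k|*h). dx is the grid
    spacing and height the continuation height, in the same length units."""
    ny, nx = grid.shape
    kx = np.fft.fftfreq(nx, d=dx)                # cycles per length unit
    ky = np.fft.fftfreq(ny, d=dx)
    k = np.hypot(*np.meshgrid(kx, ky))           # radial wavenumber |k|
    spec = np.fft.fft2(grid) * np.exp(-2.0 * np.pi * k * height)
    return np.real(np.fft.ifft2(spec))
```

The exponential damping shows why short-wavelength (high-|k|) content, and hence data discretization, dominates the error budget at low altitudes, as the analysis above finds.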
196.
Time series of zenith wet and total troposphere delays, as well as north and east gradients, are compared, and zenith total delays (ZTD) are combined at the level of parameter estimates. The input data sets are provided by ten Analysis Centers (ACs) of the International VLBI Service for Geodesy and Astrometry (IVS) for the CONT08 campaign (12–26 August 2008). Inconsistent usage of meteorological data and models, such as mapping functions, causes systematics among the ACs, while differing parameterizations and constraints add noise to the troposphere parameter estimates. The empirical standard deviation of ZTD among the ACs with respect to an unweighted mean is 4.6 mm, and the ratio of analysis noise to observation noise assessed by the operator/software impact (OSI) model is about 2.5. These and other effects have to be accounted for to improve the intra-technique combination of VLBI-derived troposphere parameters. While the largest systematics, caused by inconsistent usage of meteorological data, can be avoided, and the use of different mapping functions can be handled by applying empirical corrections, the noise has to be modeled in the stochastic model of the intra-technique combination. Applying different stochastic models shows no significant effect on the combined parameters but yields different mean formal errors of the combined ZTD: 2.3 mm (unweighted), 4.4 mm (diagonal), 8.6 mm (variance component, VC, estimation) and 8.6 mm (OSI). On the one hand, the OSI model, i.e. the inclusion of off-diagonal elements in the cofactor matrix, accounts for the reuse of observations, yielding mean formal errors about a factor of two larger than the diagonal approach. On the other hand, the combination based on VC estimation shows large differences among the VCs and exhibits a comparable scaling of the formal errors. Thus, for the combination of troposphere parameters, a combination of the two extensions of the stochastic model is recommended.
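The unweighted and diagonal combinations mentioned above reduce to simple means. A sketch for one epoch (the ZTD values and formal errors are invented, not CONT08 results, and the OSI/VC extensions with off-diagonal cofactors are not shown):

```python
import numpy as np

# Illustrative ZTD estimates (mm) from five ACs at one epoch, with formal errors.
ztd = np.array([2310.4, 2312.1, 2309.8, 2311.5, 2310.9])
sigma = np.array([3.0, 5.0, 2.5, 4.0, 3.5])

# Unweighted combination: plain mean; formal error from the sample scatter.
mean_u = ztd.mean()
sig_u = ztd.std(ddof=1) / np.sqrt(len(ztd))

# Diagonal stochastic model: weights 1/sigma^2 (no correlations between ACs).
w = 1.0 / sigma ** 2
mean_w = np.sum(w * ztd) / w.sum()
sig_w = 1.0 / np.sqrt(w.sum())
```

Because all ACs process the same observations, the diagonal model's assumption of independence is optimistic, which is exactly why the OSI model's off-diagonal cofactors roughly double the formal errors in the study.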
197.
Tomographic 4D reconstructions of ionospheric anomalies appearing in the high-latitude polar cap region are compared with plasma density measurements by a digital ionosonde located near the north magnetic pole at Eureka station and with in situ plasma measurements on board DMSP spacecraft. The moderate magnetic storm of 14–17 October 2002 is taken as an example of a geomagnetic disturbance that generates large-scale ionospheric plasma anomalies at mid-latitudes and in the polar cap region. Comparison of the GPS tomographic reconstructions over Eureka station with the ionosonde measurements of the F-layer peak densities indicates that GPS tomography correctly predicts the time of arrival and passage of the ionospheric tongue of ionization over the magnetic pole area, although the technique appears to underestimate the F-peak plasma density. Comparison with the in situ plasma measurements by the DMSP SSIES instruments shows that GPS tomography correctly reproduces the large-scale spatial structure of ionospheric anomalies over a wide range of latitudes, from mid-latitudes to the high-latitude polar cap, though the reconstructions tend to overestimate the density of the topside ionosphere at the 840 km DMSP orbit. This study is essential for understanding the quality and limitations of tomographic reconstruction techniques, particularly in high-latitude regions where GPS TEC measurements and other ionospheric data sources are limited.
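Ionospheric tomography inverts slant TEC observations for a gridded electron density field: each ray contributes one row of a linear system A x = b. One classic solver for such systems is the Kaczmarz row-action method (ART); this toy version is a generic illustration, not the reconstruction algorithm of the study:

```python
import numpy as np

def kaczmarz(A, b, iters=200, relax=1.0):
    """Algebraic reconstruction technique (Kaczmarz method) for A x = b.
    In tomography, row i of A holds the path lengths of ray i through the
    voxels, and b[i] is the slant TEC measured along that ray."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for i in range(A.shape[0]):
            ai = A[i]
            # Project x onto the hyperplane defined by the i-th observation.
            x += relax * (b[i] - ai @ x) / (ai @ ai) * ai
    return x
```

The sparse, uneven ray coverage at high latitudes makes the real system badly underdetermined, which is one source of the under/over-estimation biases reported above.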
198.
A realistic assessment of the total uncertainty budget of Global Positioning System (GPS) observations, and its adequate mathematical treatment, is a basic requirement for any analysis and interpretation of GPS-derived point positions, in particular GPS heights and their changes. This budget comprises not only random variability but also remaining systematic errors. At present, geodesy focuses mainly on stochastic approaches in which errors are modeled by means of random variables. Here, an alternative approach based on interval mathematics is presented. It allows the impact of remaining systematic errors in GPS carrier-phase observations on the final results to be modeled and quantified using deterministic error bands. Emphasis is given to the derivation of the observation intervals from influence parameters and to the study of the complex linear transfer of this type of uncertainty to the estimated point positions, which yields zonotopes. Simulation studies of GPS baselines show that the uncertainty due to remaining systematic effects dominates the total uncertainty budget for baselines longer than 200 km.
199.
A four-component decomposition scheme of the coherency matrix is presented for the analysis of polarimetric synthetic aperture radar (SAR) images. The coherency matrix is used to handle the non-reflection-symmetric scattering case, extending the covariance matrix approach; both formulations yield the same decomposition results. The advantage of the coherency formulation is that it gives explicit expressions for the four scattering powers in terms of the scattering matrix elements, which supports quantitative interpretation of polarimetric SAR data.
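To make the objects concrete: the coherency matrix is the sample average of the outer product of the Pauli scattering vector, and the fourth (helix) power is the component that vanishes under reflection symmetry. This sketch shows only those two pieces, not the full four-component branching logic:

```python
import numpy as np

def coherency_matrix(shh, shv, svv):
    """3x3 coherency matrix T = <k k^H> from the Pauli scattering vector
    k = (1/sqrt 2) [Shh+Svv, Shh-Svv, 2 Shv]^T, averaged over the samples
    (one scattering matrix per look)."""
    k = np.stack([shh + svv, shh - svv, 2.0 * shv]) / np.sqrt(2.0)
    return np.einsum("in,jn->ij", k, k.conj()) / k.shape[1]

def helix_power(T):
    """Helix scattering power Pc = 2 |Im(T23)| of the four-component
    decomposition -- the term that is zero for reflection-symmetric
    scatterers, motivating the coherency-matrix extension."""
    return 2.0 * abs(T[1, 2].imag)
```

The surface, double-bounce and volume powers follow from the remaining coherency elements after the helix term is subtracted, which is where the explicit expressions noted in the abstract come in.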
200.
This letter investigates edge effects on tree height retrieval over coniferous plantations using X-band interferometry. A coherent version of the water cloud model is used to evaluate how observation conditions such as incidence angle, tree height and terrain slope control the extent of the area affected by edge effects. Results from the model simulation are discussed in the context of actual X-band data over pine plantations, and a generic expression for the extent of edge effects is given.
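The link between canopy height and interferometry that such coherent models build on is the volume decorrelation: a scattering volume of height h produces a complex coherence whose phase and magnitude both depend on h. A sketch of the simplest case, a uniform volume with zero extinction (the letter's coherent water cloud model additionally includes ground return and attenuation terms):

```python
import numpy as np

def volume_coherence(height, kz):
    """Complex interferometric coherence of a uniform scattering volume of
    the given height (zero extinction): the normalized integral of
    exp(j*kz*z) over the canopy. kz is the vertical interferometric
    wavenumber (rad/m), set by baseline, wavelength and incidence angle."""
    a = 1j * kz * height
    return (np.exp(a) - 1.0) / a
```

The magnitude decreases and the phase centre rises with height, which is what makes height retrieval possible and why incidence angle and slope (through kz) change how far edge effects reach into a stand.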