521.
Summary  Daily pluviometric records of 43 meteorological stations across the Iberian Peninsula have permitted a detailed analysis of dry spell patterns for the period 1951–2000, distinguishing daily amount thresholds of 0.1, 1.0, 5.0 and 10.0 mm/day. The analyses are based on three annual series, namely the number of dry spells, N, the average dry spell length, L, and the extreme dry spell length, L_max. First, the statistical significance of local trends for the annual series of N, L and L_max has been investigated by means of the Mann-Kendall test, and significant field trends have been established by means of Monte Carlo simulations. Clear signs of negative field trends are detected for N (1.0 and 10.0 mm/day) and L (0.1 mm/day). Second, the Weibull model fits the empirical distributions of dry spell lengths well for all the rain gauges, whatever the daily amount threshold, with a well-ordered spatial distribution of its parameters u and k. On the basis of the Weibull distribution, return period maps for 2, 5, 10, 25 and 50 years have been obtained for dry spell lengths with respect to the four daily threshold levels. While for 0.1 and 1.0 mm/day the longest dry spells are expected in the south of the Iberian Peninsula, for 5.0 and 10.0 mm/day they are mostly found in the southeast. Finally, the elapsed time between consecutive dry spells has been analysed by considering the same rain amount thresholds and different dry spell lengths at increasing intervals of 10 days. This analysis reveals a significant negative field trend in the elapsed time between consecutive dry spells of lengths ranging from 10 to 20 days for daily amount thresholds of 1.0, 5.0 and 10.0 mm/day. Authors' addresses: X. Lana, C. Serra, Departament de Física i Enginyeria Nuclear, ETSEIB, Universitat Politècnica de Catalunya, Av. Diagonal 647 planta 11, 08028 Barcelona, Spain; M. D. Martínez, Departament de Física Aplicada, Universitat Politècnica de Catalunya, 08028 Barcelona, Spain; A. Burgueño, Departament de Meteorologia i Astronomia, Universitat de Barcelona, 08028 Barcelona, Spain; J. Martín-Vide, L. Gómez, Grup de Climatologia, Universitat de Barcelona, 08028 Barcelona, Spain.
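The Weibull-based return-period mapping described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: `fit_weibull` and `return_level` are hypothetical names, the probability-plot (least-squares) fit is one common estimator among several the study could have used, and the exceedance-probability formula assumes an average of `spells_per_year` independent dry spells per year.

```python
import numpy as np

def fit_weibull(lengths):
    """Estimate Weibull scale u and shape k from a sample of dry-spell
    lengths via a probability-plot (least-squares) fit of the linearised
    CDF: ln(-ln(1-F)) = k*ln(x) - k*ln(u)."""
    x = np.sort(np.asarray(lengths, dtype=float))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.5) / n      # Hazen plotting positions
    y = np.log(-np.log(1.0 - F))             # linearised Weibull CDF
    k, c = np.polyfit(np.log(x), y, 1)       # slope = k, intercept = -k*ln(u)
    u = np.exp(-c / k)
    return u, k

def return_level(u, k, spells_per_year, T):
    """Dry-spell length exceeded on average once every T years, given
    `spells_per_year` dry spells per year (per-spell exceedance 1/(N*T))."""
    p_exceed = 1.0 / (spells_per_year * T)
    return u * (-np.log(p_exceed)) ** (1.0 / k)
```

For example, with fitted parameters u and k at a station and its mean annual spell count, evaluating `return_level` at T = 2, 5, 10, 25 and 50 reproduces the kind of return-period values mapped in the study.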
522.
This research proposed a parallelized approach to scaling up the calculation of inundation height — the minimum sea‐level rise required to inundate a cell of a digital elevation model — based on Dijkstra's algorithm for shortest‐path calculations on a graph. Our approach combines spatial decomposition, a calculate‐and‐correct scheme, and a master/worker parallelization paradigm. The approach was tested using the U.S. Coastal Relief Model (CRM) dataset from the National Geophysical Data Center on a multicore desktop computer and on various supercomputing resources through the U.S. Extreme Science and Engineering Discovery Environment (XSEDE) program. Our parallel implementation not only enables computations larger than previously possible, but also significantly outperforms serial implementations in running time and memory footprint as the number of processing cores increases. Scaling efficiency was tied to tile size and plateaued beyond a certain number of workers.
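The serial computation being parallelized can be illustrated directly. A minimal sketch, assuming a regular-grid DEM with 4-connectivity and known ocean source cells; the bottleneck (minimax-path) variant of Dijkstra's algorithm shown here is the standard way to compute, for each cell, the minimum sea level at which it connects to the ocean through already-flooded cells:

```python
import heapq

def inundation_height(dem, sources):
    """For each DEM cell, the smallest sea level h such that the cell is
    connected to an ocean source through cells of elevation <= h.
    Path 'cost' is the maximum elevation along the path (minimax Dijkstra)."""
    rows, cols = len(dem), len(dem[0])
    INF = float("inf")
    height = [[INF] * cols for _ in range(rows)]
    pq = []
    for r, c in sources:                      # ocean cells seed the search
        height[r][c] = dem[r][c]
        heapq.heappush(pq, (dem[r][c], r, c))
    while pq:
        h, r, c = heapq.heappop(pq)
        if h > height[r][c]:                  # stale queue entry
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nh = max(h, dem[nr][nc])      # water must overtop the highest cell en route
                if nh < height[nr][nc]:
                    height[nr][nc] = nh
                    heapq.heappush(pq, (nh, nr, nc))
    return height
```

The paper's contribution is decomposing this single-pass computation into tiles and correcting across tile boundaries, which the serial sketch above does not attempt.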
524.
We present an improved mascon approach to transform monthly spherical harmonic solutions based on GRACE satellite data into mass anomaly estimates in Greenland. The GRACE-based spherical harmonic coefficients are used to synthesize gravity anomalies at satellite altitude, which are then inverted into mass anomalies per mascon. The limited spectral content of the gravity anomalies is properly accounted for by applying a low-pass filter as part of the inversion procedure, making the functional model spectrally consistent with the data. The full error covariance matrices of the monthly GRACE solutions are propagated using the law of covariance propagation. Using numerical experiments, we demonstrate the importance of proper data weighting and of spectral consistency between the functional model and the data. The developed methodology is applied to process real GRACE level-2 data (CSR RL05). The obtained mass anomaly estimates are integrated over five drainage systems, as well as over entire Greenland. We find that the statistically optimal data weighting reduces random noise by 35–69%, depending on the drainage system. The obtained mass anomaly time-series are de-trended to eliminate the contribution of ice discharge and are compared with de-trended surface mass balance (SMB) time-series computed with the Regional Atmospheric Climate Model (RACMO 2.3). We show that when using a statistically optimal data weighting in GRACE data processing, the discrepancies between GRACE-based estimates of SMB and modelled SMB are reduced by 24–47%.
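The statistically optimal data weighting amounts to generalised least squares with the full data covariance, and the estimate's covariance follows from the law of covariance propagation. A minimal numerical sketch (the function name and toy numbers are illustrative, not from the paper; a production inversion would also include the low-pass filter in the design matrix):

```python
import numpy as np

def weighted_inversion(A, d, Cd):
    """Generalised least-squares inversion of data d with covariance Cd
    through design matrix A:
        x_hat = (A^T Cd^-1 A)^-1 A^T Cd^-1 d,
    with the estimate covariance Cx = (A^T Cd^-1 A)^-1 obtained by
    covariance propagation."""
    W = np.linalg.inv(Cd)                     # data weight matrix
    N = A.T @ W @ A                           # normal matrix
    x_hat = np.linalg.solve(N, A.T @ W @ d)
    Cx = np.linalg.inv(N)                     # propagated covariance of x_hat
    return x_hat, Cx
```

With an identity covariance this reduces to ordinary least squares; inflating the variance of one observation pulls the estimate toward the better-observed data, which is the mechanism behind the noise reductions reported above.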
526.
A functional model for a bundle block adjustment in the inertial reference frame was developed, implemented and tested. This approach enables the determination of the rotation parameters of planetary bodies on the basis of photogrammetric observations. Tests with a self-consistent synthetic data set showed that the implementation converges reliably toward the expected values of the unknown parameters of the adjustment, e.g., the spin pole orientation, and that it can cope with typical observational errors in the data. We applied the model to a data set of Phobos using images from the Mars Express and Viking missions. With Phobos being in a locked rotation, we computed a forced libration amplitude of \(1.14^\circ \pm 0.03^\circ \) together with a control point network of 685 points.
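As a drastically simplified, hypothetical stand-in for the bundle adjustment, consider estimating a forced libration amplitude from synthetic rotation-angle observations. The toy model theta(t) = n·t + A·sin(n·t) (uniform rotation plus a libration term locked to the mean anomaly) is linear in the single unknown A, so the least-squares estimate is a one-line projection; the real adjustment solves simultaneously for camera poses, control points, and rotation parameters:

```python
import numpy as np

def estimate_libration_amplitude(t, theta_obs, n):
    """Least-squares estimate of the amplitude A in the toy rotation model
    theta(t) = n*t + A*sin(n*t). All names and the model itself are
    illustrative simplifications, not the paper's functional model."""
    residual = theta_obs - n * t          # remove the uniform rotation
    basis = np.sin(n * t)                 # libration signal, linear in A
    return residual @ basis / (basis @ basis)
```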
529.
Land managers responsible for invasive species removal in the USA require tools to prevent the Asian longhorned beetle (Anoplophora glabripennis) (ALB) from decimating the maple-dominant hardwood forests of Massachusetts and New England. Species distribution models (SDMs) and spread models have been applied individually to predict invasion distribution and rate of spread, but combining both models can increase the accuracy of predicted species spread over time when habitat suitability is heterogeneous across landscapes. First, an SDM was fit to 2008 ALB presence-only locations. Then, a stratified spread model was generated to measure the probability of spread due to natural and human causes. Finally, the SDM and spread models were combined to evaluate the risk of ALB spread in Central Massachusetts in 2008–2009. The SDM predicted many urban locations in Central Massachusetts as having suitable environments for species establishment. The combined model shows the greatest risk of spread and establishment in suitable locations immediately surrounding the epicentre of the ALB outbreak in Northern Worcester, with lower-risk areas in suitable locations reachable only through long-range dispersal via human transportation networks. The risk map achieved an accuracy of 67% using 2009 ALB locations for model validation. This model framework can effectively provide risk managers with valuable information concerning the timing and spatial extent of ALB spread/establishment risk and the potential strategies needed for effective future risk management efforts.
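The model combination can be sketched as a cellwise operation on two probability surfaces. This is an assumption-laden illustration — the paper does not specify its combination rule here, so multiplying suitability by spread probability is one plausible choice, and both function names are invented:

```python
import numpy as np

def combined_risk(suitability, spread_prob):
    """Couple SDM habitat suitability (0-1) with modelled spread probability
    (0-1) by cellwise multiplication: a cell is at risk only where the beetle
    can both reach it and establish in it. The combination rule is an
    assumption, not necessarily the paper's."""
    return np.asarray(suitability, float) * np.asarray(spread_prob, float)

def overall_accuracy(risk, presence_mask, threshold=0.5):
    """Overall accuracy of a thresholded risk map against observed presences,
    as in the paper's validation against the following year's locations."""
    predicted = risk >= threshold
    return float(np.mean(predicted == presence_mask))
```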
530.
Regional and national level land cover datasets, such as the National Land Cover Database (NLCD) in the United States, have become an important resource in physical and social science research. Updates to the NLCD have been conducted every 5 years since 2001; however, the procedure for producing a new release is labor-intensive and time-consuming, taking 3 or 4 years to complete. Furthermore, in most countries very few, if any, such releases exist, so there is high demand for efficient production of land cover data at different points in time. In this paper, an active machine learning framework for temporal updating (or backcasting) of land cover data is proposed and tested for three study sites covered by the NLCD. The approach employs a maximum entropy classifier to extract information from one Landsat image using the NLCD, and then replicates the classification on a Landsat image of the same geographic extent from a different point in time to create land cover data of similar quality. Results show that this framework can effectively replicate the land cover database in the temporal domain, with similar levels of overall and within-class agreement when compared against high resolution reference land cover datasets. These results demonstrate that the land cover information encapsulated in the NLCD can be extracted using solely Landsat imagery for replication purposes. The algorithm is fully automated and scalable for applications at landscape and regional scales for multiple points in time.
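The replication step can be sketched with scikit-learn, using multinomial logistic regression as the maximum-entropy classifier (a standard equivalence; the paper's exact classifier, features, and active-learning loop may differ, and `backcast_land_cover` is an invented name). Train on bands from one date labelled by the NLCD, then predict on a co-registered image from another date:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def backcast_land_cover(bands_t1, labels_t1, bands_t2):
    """Fit a maximum-entropy (multinomial logistic) classifier on Landsat
    bands at time t1 labelled by an existing land cover map, then replicate
    the classification on a co-registered image at time t2.
    Band arrays have shape (rows, cols, n_bands); labels (rows, cols)."""
    X = bands_t1.reshape(-1, bands_t1.shape[-1])   # one sample per pixel
    y = labels_t1.ravel()
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    X2 = bands_t2.reshape(-1, bands_t2.shape[-1])
    return clf.predict(X2).reshape(bands_t2.shape[:2])
```

Because training labels come from the existing database rather than field reference data, the output inherits the NLCD's legend and quality, which is exactly the "similar quality" replication described above.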