961.
A key to understanding Late Pleistocene megafaunal extinction dynamics is knowledge of megafaunal ecological responses to long-term environmental perturbations. Strategically, that requires targeting fossil deposits that accumulated during glacial and interglacial intervals both before and after human arrival, with subsequent palaeoecological models underpinned by robust and reliable chronologies. Late Pleistocene vertebrate fossil localities from the Darling Downs, eastern Australia, provide stratigraphically intact, abundant megafaunal sequences, which allow testing of anthropogenic versus climate-change megafauna extinction hypotheses. Each stratigraphic unit at site QML796, Kings Creek Catchment, was previously shown to have had similar sampling potential, and the basal units contain both small-sized taxa (e.g., land snails, frogs, bandicoots, rodents) and megafauna. Importantly, sequential faunal horizons show a stepwise decrease in taxonomic diversity, with the loss of some, but not all, megafauna in the geographically small palaeocatchment. The purpose of this paper is to present the results of our intensive, multidisciplinary dating study of the deposits (>40 dates). Ages obtained by accelerator mass spectrometry (AMS) 14C dating (targeting bone, freshwater molluscs, and charcoal) and thermal ionisation mass spectrometry U/Th dating (targeting teeth and freshwater molluscs) do not agree with each other and, in the case of AMS 14C dating, lack internal consistency. Scanning electron microscopy and rare earth element analyses demonstrate that the dated molluscs are diagenetically altered and contain aragonite cements that incorporated secondary young carbon, suggesting that such dates should be regarded as minimum ages. AMS 14C-dated charcoals yield ages that occur out of stratigraphic order and cluster at the upper chronological limit of the technique (~40–48 ka); again, we suggest that such results should be regarded as suspect and treated only as minimum ages. Subsequent OSL and U/Th (teeth) dating provide complementary results and demonstrate that the faunal sequences actually span ~120–83 ka, thus lying beyond the AMS 14C dating window. Importantly, the dates suggest that the local decline in biological diversity was initiated ~75,000 years before human colonisation of the continent. Collectively, the data are most parsimoniously consistent with a pre-human climate-change model for local habitat change and megafauna extinction, but not with the nearly simultaneous extinction of megafauna required by the human-induced blitzkrieg extinction hypothesis. This study demonstrates the problems inherent in dating deposits that lie near the chronological limits of the radiocarbon technique, and highlights the need to cross-check previously dated archaeological and megafauna deposits within the timeframe of earliest human colonisation and latest megafaunal survival.
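The cross-checking logic described in this abstract — ages must decrease up-section, and radiocarbon ages clustering near the technique's ceiling are minimum ages only — can be sketched as a simple consistency check. The sketch below is illustrative only; the field names, the ~48 ka ceiling value and the sample data are assumptions, not the authors' actual workflow or results.

```python
# Illustrative sketch (not the authors' workflow): flag dates that are
# out of stratigraphic order or that sit near the ~48 ka radiocarbon ceiling.

RADIOCARBON_CEILING_KA = 48.0  # assumed practical limit of AMS 14C dating

def check_chronology(samples):
    """samples: list of dicts ordered from stratigraphically lowest (oldest)
    to highest (youngest), each with 'unit', 'method' and 'age_ka' keys."""
    warnings = []
    prev_age = float("inf")
    for s in samples:
        # Ages should decrease (or stay equal) moving up-section.
        if s["age_ka"] > prev_age:
            warnings.append(f"{s['unit']} ({s['method']}): age {s['age_ka']} ka "
                            f"is out of stratigraphic order")
        # 14C ages clustering at the technique's ceiling are minimum ages only.
        if s["method"] == "AMS14C" and s["age_ka"] >= RADIOCARBON_CEILING_KA - 8:
            warnings.append(f"{s['unit']} (AMS14C): {s['age_ka']} ka is near the "
                            f"radiocarbon limit; treat as a minimum age")
        prev_age = min(prev_age, s["age_ka"])
    return warnings

# Hypothetical example: a middle unit dated far younger than the unit above it.
print(check_chronology([
    {"unit": "basal",  "method": "U/Th",   "age_ka": 122.0},
    {"unit": "middle", "method": "AMS14C", "age_ka": 43.5},
    {"unit": "upper",  "method": "OSL",    "age_ka": 83.0},
]))
```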
962.
Hydrological connectivity describes the physical coupling (linkages) of different elements within a landscape with respect to surface and subsurface flows. A firm understanding of hydrological connectivity is important for catchment management applications, for example habitat and species protection, and for improving flood resistance and resilience. Thinking about (geomorphological) systems as networks can lead to new insights, as reflected in the recent increase in the use of network (graph) theory within the geosciences. Network theory supports the analysis and understanding of complex systems by providing data structures for modelling objects and their linkages, and a versatile toolbox for quantitatively appraising network structure and properties. The objective of this study was to characterize and quantify overland flow connectivity dynamics on hillslopes in a humid sub-Mediterranean environment by combining high-resolution digital terrain models, overland flow sensors and a network approach. Results showed significant differences between overland flow connectivity on agricultural areas and on semi-natural shrub areas. Significant positive correlations were found between connectivity and precipitation characteristics, and significant negative correlations between connectivity and soil moisture, most likely because of soil water repellency and/or soil surface crusting. The combination of structural networks and dynamic networks for determining potential connectivity and actual connectivity proved a powerful tool for analysing overland flow connectivity. Copyright © 2016 John Wiley & Sons, Ltd.
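The structural-versus-functional network idea described above lends itself to a small graph sketch. Assuming flow-path links are available as (upslope, downslope) pairs — the node names, edge lists and the simple connectivity metric below are illustrative assumptions, not the study's toolchain — the comparison might look like this:

```python
# Minimal sketch of treating overland flow paths as a directed graph.
import networkx as nx

# Structural (potential) network: edges point from upslope to downslope elements.
structural = nx.DiGraph([
    ("hillslope_A", "gully_1"), ("hillslope_B", "gully_1"),
    ("gully_1", "channel"), ("hillslope_C", "channel"),
    ("channel", "outlet"),
])

def connected_fraction(graph, outlet="outlet"):
    """Fraction of landscape elements from which a flow path reaches the outlet."""
    sources = [n for n in graph if n != outlet]
    linked = sum(nx.has_path(graph, n, outlet) for n in sources)
    return linked / len(sources)

# Functional (actual) network: keep only links where overland flow sensors
# recorded flow during a given rainfall event (hypothetical event here).
active_edges = [("hillslope_A", "gully_1"), ("gully_1", "channel"),
                ("channel", "outlet")]
functional = nx.DiGraph(active_edges)
functional.add_nodes_from(structural)  # keep disconnected elements in the count

print("potential connectivity:", connected_fraction(structural))
print("actual connectivity:  ", connected_fraction(functional))
```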
963.
Historically, observing snow depth over large areas has been difficult. When snow depth observations are sparse, regression models can be used to infer snow depth over a given area. Data sparsity has also left many important questions about such inference unexamined. Improved inference, or estimation, of snow depth and its spatial distribution from a given set of observations can benefit a wide range of applications, from water resource management to ecological studies to validation of satellite estimates of snow pack. The development of Light Detection and Ranging (LiDAR) technology has provided non-sparse snow depth measurements, which we use in this study to address fundamental questions about snow depth inference using both sparse and non-sparse observations. For example, when are more data needed and when are data redundant? Results apply to traditional, manual snow depth measurements and to LiDAR observations. Through sampling experiments on high-resolution LiDAR snow depth observations at six separate 1.17-km2 sites in the Colorado Rocky Mountains, we provide novel perspectives on a variety of issues affecting the regression estimation of snow depth from sparse observations. We measure the effects of observation count, random selection of observations, quality of predictor variables, and cross-validation procedures using three skill metrics: percent error in total snow volume, root mean squared error (RMSE), and R2. Extremes of predictor quality are used to understand the range of its effect: how do predictors downloaded from the internet perform against more accurate predictors measured by LiDAR? Whereas cross-validation remains the only option for validating inference from sparse observations, in our experiments the full set of LiDAR-measured snow depths can be considered the 'true' spatial distribution and used to understand cross-validation bias at the spatial scale of inference. We model at the 30-m resolution of readily available predictors, which is a popular spatial resolution in the literature. Three regression models are also compared, and we briefly examine how sampling design affects model skill. Results quantify the primary dependence of each skill metric on observation count, which ranges over three orders of magnitude, doubling at each step from 25 up to 3200. Whereas uncertainty (resulting from random selection of observations) in percent error of true total snow volume is typically well constrained by 100–200 observations, there is considerable uncertainty in the inferred spatial distribution (R2) even at medium observation counts (200–800). We show that percent error in total snow volume is not sensitive to predictor quality, although RMSE and R2 (measures of spatial distribution) often depend critically on it. Inaccuracies in downloaded predictors (most often the vegetation predictors) can easily require a quadrupling of observation count to match the RMSE and R2 scores obtained with LiDAR-measured predictors. Under cross-validation, the RMSE and R2 skill measures are consistently biased towards poorer results than their true validations. This is primarily a result of greater variance at the spatial scales of the point observations used for cross-validation than at the 30-m resolution of the model. The magnitude of this bias depends on individual site characteristics, observation count (for our experimental design), and sampling design. Sampling designs that maximize independent information maximize cross-validation bias but also maximize true R2. The bagging tree model is found to generally outperform the other regression models in the study on several criteria. Finally, we discuss and recommend the use of LiDAR in conjunction with regression modelling to advance understanding of snow depth spatial distribution at spatial scales of thousands of square kilometres. Copyright © 2012 John Wiley & Sons, Ltd.
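A minimal sketch of the sampling-experiment logic — fit a regression on a sparse subsample, predict the full grid, then score against the LiDAR "truth" with the three skill metrics — is given below. The synthetic predictors, the scikit-learn BaggingRegressor stand-in for the bagging tree model, and all parameter values are assumptions, not the study's configuration.

```python
# Hedged sketch: regression estimation of snow depth from sparse observations,
# validated against a full (synthetic) grid standing in for LiDAR truth.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n_cells = 10_000                               # 30-m cells over a hypothetical site
X = rng.normal(size=(n_cells, 4))              # predictors: elevation, slope, etc.
true_depth = 1.5 + X @ [0.6, -0.3, 0.2, 0.1] + rng.normal(0, 0.3, n_cells)
true_depth = np.clip(true_depth, 0, None)      # snow depth cannot be negative

for n_obs in (25, 100, 400, 1600):             # observation counts, doubling
    idx = rng.choice(n_cells, n_obs, replace=False)
    model = BaggingRegressor(n_estimators=50, random_state=0)  # bagged trees
    model.fit(X[idx], true_depth[idx])
    pred = model.predict(X)                    # infer depth over the whole grid

    vol_err = 100 * (pred.sum() - true_depth.sum()) / true_depth.sum()
    rmse = mean_squared_error(true_depth, pred) ** 0.5
    print(f"n={n_obs:5d}  volume error {vol_err:+5.1f}%  "
          f"RMSE {rmse:.2f} m  R2 {r2_score(true_depth, pred):.2f}")
```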
964.
Beyond the choice of optimization algorithm itself, many factors can affect how well an optimization technique performs. The main purpose of this paper is to investigate the effect of boundary constraint handling (BCH) schemes on the performance of optimization algorithms. To this end, a number of deterministic and probabilistic BCH approaches are applied to one of the most recently proposed optimization techniques, the interior search algorithm (ISA). In addition to implementing the different BCH methods, a sensitivity analysis is conducted to find an appropriate setting for the single parameter of ISA. Concrete cantilever retaining wall design, one of the most important geotechnical problems, is tackled to demonstrate the proficiency of the ISA algorithm on the one hand, and to benchmark the effect of BCH schemes on the final results on the other. As the results demonstrate, the various BCH approaches have a perceptible impact on algorithm performance. Likewise, the single parameter of ISA can play a pivotal role in the algorithm's efficiency. Copyright © 2017 John Wiley & Sons, Ltd.
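Two common BCH schemes — deterministic clipping onto the boundary and probabilistic re-initialization inside the feasible box — are sketched below as generic repair operators. They illustrate the kind of schemes compared in the paper but are not necessarily its exact set or implementation.

```python
# Generic illustration of boundary constraint handling (BCH): when a candidate
# solution leaves the feasible box, repair it before evaluation.
import numpy as np

def bch_clip(x, lower, upper):
    """Deterministic scheme: absorb violating components onto the boundary."""
    return np.clip(x, lower, upper)

def bch_random(x, lower, upper, rng):
    """Probabilistic scheme: redraw violating components uniformly in the box."""
    x = x.copy()
    bad = (x < lower) | (x > upper)
    x[bad] = rng.uniform(lower[bad], upper[bad])
    return x

rng = np.random.default_rng(1)
lower, upper = np.array([0.0, 0.0]), np.array([5.0, 10.0])
candidate = np.array([-1.2, 12.5])             # hypothetical infeasible trial design
print(bch_clip(candidate, lower, upper))       # -> [ 0. 10.]
print(bch_random(candidate, lower, upper, rng))  # -> random point inside the box
```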
965.
Large water-sample sets collected from 1899 through 1902, in 1907, and in the early 1950s allow comparisons of pre-impoundment and post-impoundment (1969 through 2008) nitrogen concentrations in the lower Missouri River. Although urban wastes were not large enough to detectably increase annual loads of total nitrogen at the beginning of the 20th century, carcass waste, stock-yard manure, and untreated human wastes measurably increased ammonia and organic-nitrogen concentrations during low flows. Average total-nitrogen concentrations in both periods were about 2.5 mg/l, but much of the particulate organic nitrogen, which was the dominant form of nitrogen around 1900, has been replaced by nitrate. This change in speciation was caused by the nearly 80% decrease in suspended-sediment concentrations that occurred after impoundment, modern agriculture, drainage of riparian wetlands, and sewage treatment. Nevertheless, bioavailable nitrogen has not been low enough to limit primary production in the Missouri River since the beginning of the 20th century. Nitrate concentrations have increased more rapidly from 2000 through 2008 (5 to 12% per year), thus increasing bioavailable nitrogen delivered to the Mississippi River and affecting Gulf Coast hypoxia. The increase in nitrate concentrations with distance downstream is much greater during the post-impoundment period. If strategies to decrease total-nitrogen loads focus on particulate N, substantial decreases will be difficult because particulate nitrogen is now only 23% of total nitrogen in the Missouri River. A strategy aimed at decreasing particulates also could further exacerbate land loss along the Gulf of Mexico, which has been sediment-starved since Missouri River impoundment. In contrast, strategies or benchmarks aimed at decreasing nitrate loads could substantially decrease nitrogen loadings because nitrates now constitute over half of the Missouri's nitrogen input to the Mississippi. Ongoing restoration and creation of wetlands along the Missouri River could be part of such a nitrate-reduction strategy. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
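A back-of-the-envelope sketch of the speciation argument: halving the nitrate load moves total nitrogen far more than halving the particulate load, given the reported shares (23% particulate N, over half nitrate). The total-load figure below is hypothetical; only the fractions come from the abstract.

```python
# Back-of-the-envelope sketch of the speciation argument. The total load is a
# hypothetical placeholder; the fractions are taken from the abstract.
total_n_load = 100.0          # hypothetical annual total-N load (arbitrary units)
particulate_frac = 0.23       # particulate N share of total N (reported)
nitrate_frac = 0.55           # nitrate share, "over half" of total N

cut = 0.5                     # reduce the targeted species by 50%
print("after 50% particulate cut:", total_n_load * (1 - particulate_frac * cut))  # 88.5
print("after 50% nitrate cut:    ", total_n_load * (1 - nitrate_frac * cut))      # 72.5
```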