201.
In a project to classify livestock grazing intensity using participatory geographic information systems (PGIS), we encountered the problem of how to synthesize PGIS-based maps of livestock grazing intensity that were prepared separately by local experts. We investigated the utility of evidential belief functions (EBFs) and Dempster's rule of combination for representing classification uncertainty and integrating the PGIS-based grazing intensity maps. These maps were used as individual sets of evidence in the application of EBFs to evaluate the proposition "This area or pixel belongs to the high, medium, or low grazing intensity class because the local expert(s) say so". The class-area-weighted averages of EBFs based on each of the PGIS-based maps show that the lowest degree of classification uncertainty is associated with maps in which "vegetation species" was used as the mapping criterion. This criterion, together with local landscape attributes of livestock use, may be considered an appropriate standard measure of grazing intensity. The maps of integrated EBFs of grazing intensity show that classification uncertainty is high when the local experts apply two or more mapping criteria together. This study demonstrates the usefulness of EBFs for representing classification uncertainty and the possibility of using the EBF values to identify criteria for PGIS-based mapping of livestock grazing intensity.
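For readers unfamiliar with Dempster's rule, the minimal sketch below combines two hypothetical experts' mass assignments over the three grazing-intensity classes; the numbers and the "theta" (ignorance) term are illustrative, not taken from the study.

```python
# Toy Dempster's rule of combination for two experts' mass functions whose
# focal elements are the singleton classes plus "theta", the full frame of
# discernment (uncommitted belief). All masses here are hypothetical.

def combine_dempster(m1, m2, classes=("high", "medium", "low")):
    combined = {c: 0.0 for c in classes}
    combined["theta"] = m1["theta"] * m2["theta"]
    conflict = 0.0
    for a in classes:
        for b in classes:
            if a == b:
                combined[a] += m1[a] * m2[b]   # agreeing evidence
            else:
                conflict += m1[a] * m2[b]      # contradictory evidence
        # a singleton intersected with the full frame stays the singleton
        combined[a] += m1[a] * m2["theta"] + m1["theta"] * m2[a]
    k = 1.0 - conflict                         # normalization constant
    return {hyp: mass / k for hyp, mass in combined.items()}

# Expert 1 leans "high"; expert 2 leans "medium".
expert1 = {"high": 0.6, "medium": 0.2, "low": 0.1, "theta": 0.1}
expert2 = {"high": 0.3, "medium": 0.4, "low": 0.1, "theta": 0.2}
print(combine_dempster(expert1, expert2))
```

The normalization by 1 minus the conflict mass is what makes strongly disagreeing experts produce high residual uncertainty, which is the behavior the study observes when two or more mapping criteria are applied together.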
202.
Within the conceptual framework of Complex Systems, we discuss the importance of, and challenges in, extracting and linking multiscale objects from high-resolution remote sensing imagery to improve the monitoring, modeling and management of complex landscapes. In particular, we emphasize that remote sensing data are a particular case of the modifiable areal unit problem (MAUP) and describe how image-objects provide a way to reduce this problem. We then hypothesize that multiscale analysis should be guided by the intrinsic scale of the dominant landscape objects composing a scene, and describe three multiscale image-processing techniques with the potential to achieve this. Each of these techniques, i.e., the Fractal Net Evolution Approach (FNEA), Linear Scale-Space and Blob-Feature Detection (SS), and Multiscale Object-Specific Analysis (MOSA), facilitates multiscale pattern analysis, exploration and hierarchical linking of image-objects based on methods that derive spatially explicit multiscale contextual information from a single resolution of remote sensing imagery. We then outline the strengths and weaknesses of each technique and provide strategies for their improvement.
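As a quick illustration of the second technique (SS), the sketch below runs Laplacian-of-Gaussian scale-space blob detection with scikit-image; the sample image and all parameter values are illustrative and not from the paper.

```python
import numpy as np
from skimage import data, feature

# Linear scale-space blob detection: blob_log searches a Laplacian-of-
# Gaussian scale-space for local extrema and returns (row, col, sigma) per
# blob; sigma indicates the intrinsic scale of the detected image-object.
image = data.camera().astype(float) / 255.0
blobs = feature.blob_log(image, min_sigma=2, max_sigma=30,
                         num_sigma=10, threshold=0.1)

# Effective blob radius for a 2-D LoG response is sqrt(2) * sigma.
blobs[:, 2] *= np.sqrt(2)
print(f"{len(blobs)} blobs; first (row, col, radius): {blobs[0]}")
```

The per-blob sigma is the hook for hierarchical linking: objects detected at coarse scales can be associated with the finer-scale blobs they spatially contain.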
203.

Background

Forest fuel treatments have been proposed as tools to stabilize carbon stocks in fire-prone forests in the Western U.S.A. Although fuel treatments such as thinning and burning are known to immediately reduce forest carbon stocks, there are suggestions that these losses may be paid back over the long term if treatments sufficiently reduce future wildfire severity, or prevent deforestation. Although fire severity and post-fire tree regeneration have been identified as important influences on long-term carbon dynamics, it remains unclear how natural variability in these processes might affect the ability of fuel treatments to protect forest carbon resources. We surveyed a wildfire where fuel treatments had been put in place before the fire and estimated the short-term impact of treatment and wildfire on aboveground carbon stocks at our study site. We then used a common vegetation growth simulator in conjunction with sensitivity analysis techniques to assess how predicted timescales of carbon recovery after fire are sensitive to variation in rates of fire-related tree mortality and post-fire tree regeneration.
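As a toy illustration of such a sensitivity analysis, the sketch below sweeps mortality and regeneration rates through a simple logistic regrowth model; this is a stand-in for the vegetation growth simulator, and every parameter value is hypothetical.

```python
# Toy sensitivity analysis: years until aboveground carbon returns to 99%
# of its pre-fire baseline, as a function of fire-related tree mortality
# and a post-fire regrowth rate. All numbers are hypothetical.

def recovery_years(mortality, regen_rate, baseline=100.0, max_years=500):
    carbon = baseline * (1.0 - mortality)        # carbon surviving the fire
    for year in range(1, max_years + 1):
        # logistic regrowth toward the pre-fire baseline
        carbon += regen_rate * carbon * (1.0 - carbon / baseline)
        if carbon >= 0.99 * baseline:
            return year
    return max_years

for mortality in (0.3, 0.6, 0.9):                # fire-related mortality
    for regen in (0.02, 0.05):                   # regeneration rate
        print(f"mortality={mortality:.1f}, regen={regen:.2f} -> "
              f"{recovery_years(mortality, regen)} years")
```

Even this toy reproduces the qualitative pattern reported below: mortality dominates recovery time, and the regeneration rate matters most when mortality (fire severity) is high.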

Results

We found that fuel reduction treatments were successful at ameliorating fire severity at our study site by removing an estimated 36% of aboveground biomass. Treated and untreated stands stored similar amounts of carbon three years after wildfire, but differences in fire severity were such that untreated stands maintained only 7% of aboveground carbon as live trees, versus 51% in treated stands. Over the long term, our simulations suggest that treated stands in our study area will recover baseline carbon storage 10–35 years more quickly than untreated stands. Our sensitivity analysis found that rates of fire-related tree mortality strongly influence estimates of post-fire carbon recovery. Rates of regeneration were less influential on recovery timing, except when fire severity was high.

Conclusions

Our ability to predict the response of forest carbon resources to anthropogenic and natural disturbances requires models that incorporate uncertainty in the processes important to long-term forest carbon dynamics. To the extent that fuel treatments can reduce tree mortality rates or prevent deforestation resulting from wildfire, our results suggest that treatments may be a viable strategy for stabilizing existing forest carbon stocks.
204.
A new method is presented for computing the gravitational attraction of topographic masses when their height information is given on a regular grid. It is shown that representing the terrain relief by a bilinear surface not only offers a serious alternative to polyhedral modeling, but also approximates the continuous terrain even more smoothly. Inserting a bilinear approximation into the known scheme for deriving closed analytical expressions for the potential and its first-order derivatives of an arbitrarily shaped polyhedron leads to a one-dimensional integration with, apparently, no analytical solution. However, owing to the high degree of smoothness of the integrand, the numerical computation of this integral is very efficient. Numerical tests using synthetic data and a densely sampled digital terrain model in the Bavarian Alps show that the new method is comparable to, or even faster than, terrain modeling using polyhedra.
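A minimal numerical sketch of the underlying idea: the Newton integral over one grid cell whose top is a bilinear surface, integrated analytically in z and numerically in the plane. This is a simplification (the paper reduces further to a single 1-D integral), and all heights and densities are hypothetical.

```python
import numpy as np
from scipy.integrate import dblquad

G = 6.674e-11       # gravitational constant [m^3 kg^-1 s^-2]
RHO = 2670.0        # conventional topographic density [kg m^-3]

# One grid cell [0,d]x[0,d]; the bilinear surface interpolates the four
# corner heights smoothly (cf. the paper's terrain representation).
d = 100.0
h00, h10, h01, h11 = 120.0, 135.0, 128.0, 150.0   # hypothetical heights [m]

def h(x, y):
    u, v = x / d, y / d
    return (h00*(1-u)*(1-v) + h10*u*(1-v) + h01*(1-u)*v + h11*u*v)

# Vertical attraction at P = (0, 0, H), with the z-integration of Newton's
# integral done analytically, leaving a smooth 2-D integrand for quadrature.
H = 200.0
def integrand(y, x):
    r2 = x*x + y*y
    return 1/np.sqrt(r2 + (H - h(x, y))**2) - 1/np.sqrt(r2 + H*H)

gz, _ = dblquad(integrand, 0.0, d, 0.0, d)
print(f"g_z ≈ {G * RHO * gz * 1e5:.4f} mGal")
```

The smoothness of the integrand is what makes numerical quadrature cheap here, which mirrors the efficiency argument made in the abstract.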
205.
The traditional remove-restore technique for geoid computation suffers from two main drawbacks. The first is the assumption of an isostatic hypothesis to compute the compensating masses. The second is the double consideration of the effect of the topographic-isostatic masses within the data window, through both the removal of the reference field and the terrain reduction process. To overcome the first disadvantage, seismic Moho depths, representing more or less the actual compensating masses, have been used with variable density anomalies computed by employing the topographic-isostatic mass balance principle. To avoid the double consideration of the effect of the topographic-isostatic masses within the data window, the effect of these masses for the fixed data window used, expressed in terms of potential coefficients, has been subtracted from the reference field, yielding an adapted reference field. This adapted reference field has been used for the remove-restore technique. The necessary harmonic analysis of the topographic-isostatic potential using seismic Moho depths with variable density anomalies is given. A wide comparison is made among geoids computed with the adapted reference field, using both the Airy-Heiskanen isostatic model and seismic Moho depths with variable density anomaly, and a geoid computed by the traditional remove-restore technique. The results show that using seismic Moho depths with variable density anomaly along with the adapted reference field gives the best relative geoid accuracy compared to the GPS/levelling geoid. Received: 3 October 2001 / Accepted: 20 September 2002. Correspondence to: H.A. Abd-Elmotaal
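For context, the classical remove-compute-restore decomposition that the paper modifies reads, in textbook form (this is the standard formulation, not the paper's adapted variant):

```latex
% Remove-compute-restore: split off the reference field and topographic
% effects, apply Stokes's integral to the residual anomaly only, restore.
\begin{align*}
  \Delta g_{\mathrm{res}} &= \Delta g - \Delta g_{\mathrm{ref}}
                             - \Delta g_{\mathrm{topo}},\\
  N &= N_{\mathrm{ref}}
      + \frac{R}{4\pi\gamma}\iint_{\sigma}
        \Delta g_{\mathrm{res}}\, S(\psi)\, \mathrm{d}\sigma
      + N_{\mathrm{ind}}.
\end{align*}
```

The paper's "adapted reference field" amounts to modifying the reference term so that the topographic-isostatic contribution inside the data window is not counted in both the remove steps.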
206.
In this short contribution it is demonstrated how integer carrier-phase cycle ambiguity resolution will perform in the near future, when the US GPS is modernized and the European Galileo system becomes operational. The capability of ambiguity resolution is analyzed in the context of precise differential positioning over short, medium and long distances. Starting from today's dual-frequency GPS operation, augmenting the number of satellites in particular turns out to have beneficial consequences for correctly resolving the ambiguities. With a 'double' constellation, on short baselines, the confidence of the integer ambiguity solution increases to a level of 0.99999999 or beyond.
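One standard way to quantify such confidence is the integer bootstrapping success rate from the GNSS literature (a textbook formula, not reproduced from this paper); the sketch below evaluates it for hypothetical conditional standard deviations.

```python
import math

# Integer bootstrapping success rate:
#   P = prod_i ( 2*Phi(1/(2*sigma_i)) - 1 )
# where sigma_i are the conditional standard deviations of the (decorrelated)
# ambiguities and Phi is the standard normal CDF.

def phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_success_rate(cond_sigmas):
    p = 1.0
    for s in cond_sigmas:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

# Hypothetical conditional sigmas (cycles): more satellites, smaller sigmas.
single = [0.15, 0.12, 0.10, 0.08]
double = [0.08, 0.06, 0.05, 0.04, 0.04, 0.03, 0.03, 0.02]
print(f"single constellation: {bootstrap_success_rate(single):.8f}")
print(f"double constellation: {bootstrap_success_rate(double):.8f}")
```

The doubled constellation both adds factors and shrinks each sigma, which is how success rates climb toward the 0.99999999 level cited above.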
207.
It is suggested that a spherical harmonic representation of geoidal heights using global Earth gravity models (EGMs) might be accurate enough for many applications, although some short-wavelength signals are missing from a potential coefficient model. A 'direct' method of geoidal height determination from global Earth gravity model coefficients alone, and an 'indirect' approach in which geoidal heights are determined via height anomalies computed from a global gravity model, are investigated. In both methods, suitable correction terms are applied. The results of computations in two test areas show that the direct and indirect approaches yield good agreement with the classical gravimetric geoidal heights determined from Stokes' formula. Surprisingly, the indirect method yields better agreement with geoid heights derived from global positioning system (GPS)/levelling, which are used to demonstrate such improvements, than the gravimetric geoid heights at the same GPS stations. It is demonstrated that the application of correction terms in both methods improves the agreement of geoidal heights at GPS/levelling stations. It is also found that the correction terms in the direct method are mostly similar to those used for the indirect determination of geoidal heights from height anomalies. Received: 26 July 2001 / Accepted: 21 February 2002
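In textbook form, the 'direct' computation is a spherical harmonic synthesis of the geoidal height from the model coefficients (standard formula shown for context; the correction terms discussed in the paper are omitted here):

```latex
% Geoid undulation from an Earth gravity model: Bruns's formula applied to
% the spherical harmonic expansion of the disturbing potential.
\[
  N(\varphi,\lambda) \approx \frac{GM}{r\,\gamma}
    \sum_{n=2}^{n_{\max}} \left(\frac{a}{r}\right)^{n}
    \sum_{m=0}^{n}
    \left(\Delta\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right)
    \bar{P}_{nm}(\sin\varphi)
\]
```

Truncation at n_max is exactly why the short-wavelength signal mentioned above is missing from any potential coefficient model.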
208.
Omitted variables and measurement errors in explanatory variables frequently occur in hedonic price models. Ignoring these problems leads to biased estimators. In this paper, we develop a constrained autoregression–structural equation model (ASEM) to handle both types of problems. Standard panel data models for handling omitted-variable bias rest on the assumption that the omitted variables are time-invariant. ASEM handles both time-varying and time-invariant omitted variables through constrained autoregression. In the case of measurement error, standard approaches require additional external information, which is usually difficult to obtain. ASEM exploits the fact that panel data are measured repeatedly, which allows the variance of a variable to be decomposed into the true variance and the variance due to measurement error. We apply ASEM to estimate a hedonic housing model for urban Indonesia. To gain insight into the consequences of measurement error and omitted variables, we compare the ASEM estimates with the outcomes of (1) a standard SEM, which does not account for omitted variables, (2) a constrained autoregression model, which does not account for measurement error, and (3) a fixed-effects hedonic model, which ignores measurement error and time-varying omitted variables. The differences between the ASEM estimates and the outcomes of the three alternative approaches are substantial.
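The repeated-measurement idea can be illustrated with a toy version of the classical errors-in-variables decomposition (a simplification, not ASEM itself): if x_it = xi_i + e_it with i.i.d. errors, the covariance between two panel waves identifies the true variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decomposition: x_it = xi_i + e_it. Cov(x_i1, x_i2) recovers Var(xi),
# so Var(x) - Cov(x_i1, x_i2) recovers the measurement-error variance.
n = 100_000
true_var, err_var = 4.0, 1.0
xi = rng.normal(0.0, np.sqrt(true_var), n)       # time-invariant true value
x1 = xi + rng.normal(0.0, np.sqrt(err_var), n)   # wave 1 measurement
x2 = xi + rng.normal(0.0, np.sqrt(err_var), n)   # wave 2 measurement

cov12 = np.cov(x1, x2)[0, 1]
print(f"estimated Var(xi): {cov12:.3f} (true {true_var})")
print(f"estimated Var(e):  {x1.var(ddof=1) - cov12:.3f} (true {err_var})")
```

This is the intuition behind ASEM's claim that repeated measurement substitutes for the external information that standard measurement-error corrections require.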
209.
Multi-technique space geodetic analysis software has been developed which makes it possible to combine data at the observation level. In addition to local tie information, site-wise common parameters, i.e., troposphere and clocks, can be estimated with this software. We discuss how common parameters should be estimated and where biases/offsets need to be taken into account. To test this novel concept, Global Positioning System (GPS) and Very Long Baseline Interferometry (VLBI) data from the CONT11 campaign are used. Since the VLBI baselines of this campaign extend over several thousands of kilometers, the GPS data are processed in precise point positioning mode, with satellite orbits and clocks fixed to the IGS final products. The results show that combining space geodetic data at the observation level consistently improves station position repeatability as well as nuisance parameters such as troposphere estimates. Furthermore, estimating common parameters (troposphere or clocks) at co-located sites improves the solution further and yields a highly physically consistent model of the parameters concerned.
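A minimal sketch of the shared-parameter idea (hypothetical numbers; real GPS/VLBI processing uses far richer observation models): both techniques at a co-located site observe delays driven by the same zenith wet delay, so their observation equations can be stacked with that parameter in common.

```python
import numpy as np

# Two techniques observe delays depending on a shared zenith wet delay (ZWD)
# through an elevation-dependent mapping function, plus a technique-specific
# bias. Stacking both designs with one common ZWD column mimics combination
# at the observation level. All values are hypothetical.

rng = np.random.default_rng(1)
zwd_true, bias_gps, bias_vlbi = 0.150, 0.020, -0.010   # meters

def make_obs(elevs_deg, bias, noise):
    mf = 1.0 / np.sin(np.radians(elevs_deg))           # simple mapping function
    return mf, zwd_true * mf + bias + rng.normal(0, noise, len(mf))

mf_g, y_g = make_obs(rng.uniform(10, 80, 50), bias_gps, 0.005)
mf_v, y_v = make_obs(rng.uniform(10, 80, 20), bias_vlbi, 0.003)

# Design columns: [common ZWD, GPS bias, VLBI bias]
A = np.zeros((70, 3))
A[:50, 0], A[:50, 1] = mf_g, 1.0
A[50:, 0], A[50:, 2] = mf_v, 1.0
y = np.concatenate([y_g, y_v])

x, *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"ZWD {x[0]:.4f} m, GPS bias {x[1]:.4f} m, VLBI bias {x[2]:.4f} m")
```

The common ZWD column is observed by both techniques, so the combined solution is better conditioned than either single-technique solution, which is the mechanism behind the repeatability improvements reported above.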
210.
A procedure for continental-scale mapping of burned boreal forest at 10-day intervals was developed for application to coarse-resolution satellite imagery. The basis of the technique is a multiple logistic regression model parameterized using 1998 SPOT-4 VEGETATION clear-sky composites and training sites selected across Canada. Predictor features consisted of multi-temporal change metrics based on reflectance and two vegetation indices, normalized to the trajectory of the background vegetation to account for phenological variation. Spatial-contextual tests applied to the logistic model output were developed to remove noise and increase the sensitivity of detection. The procedure was applied over Canada for the 1998-2000 fire seasons and validated using fire surveys and burned-area statistics from forest fire management agencies. The area of falsely mapped burns was found to be small (3.5% commission error over Canada), and most burns larger than 10 km² were accurately detected and mapped (R² = 0.90, P < 0.005, n = 91 for burns in two provinces). Canada-wide satellite-derived burned area was similar to, but consistently smaller than, the statistics compiled by the Canadian Interagency Forest Fire Centre (by 17% in 1998, 16% in 1999, and 3% in 2000).
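As an illustration of the modeling step (a scikit-learn stand-in, not the authors' original implementation; the features and all values are hypothetical), a multiple logistic regression on multi-temporal change metrics might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Multiple logistic regression for burned/unburned mapping, sketched with
# synthetic pixels. Features mimic multi-temporal change metrics (drops in
# vegetation indices and a reflectance change), normalized to the
# background-vegetation trajectory. All numbers are hypothetical.

rng = np.random.default_rng(42)
n = 2000
burned = rng.random(n) < 0.1                       # ~10% burned pixels
d_ndvi = rng.normal(np.where(burned, -0.35, 0.0), 0.08)
d_swir = rng.normal(np.where(burned, 0.20, 0.0), 0.06)
d_refl = rng.normal(np.where(burned, 0.10, 0.0), 0.05)
X = np.column_stack([d_ndvi, d_swir, d_refl])

model = LogisticRegression().fit(X, burned)
prob = model.predict_proba(X)[:, 1]                # per-pixel burn probability
print(f"training accuracy: {model.score(X, burned):.3f}")
```

A probability threshold plus spatial-contextual filtering, as described above, would then turn the per-pixel probabilities into the final burned-area map.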