91.
Multichannel high-resolution seismic and multibeam data were acquired from the Maldives isolated carbonate platform in the Indian Ocean for a detailed characterization of the Neogene bank architecture of this edifice. The goal of the research is to decipher the controlling factors of platform evolution, with special emphasis on sea-level changes and changes in oceanic currents. The stacking pattern of Lower to Middle Miocene depositional sequences, with an evolution from a ramp geometry to a flat-topped platform, reflects variations in accommodation, which here are proposed to be primarily governed by fluctuations of relative sea level. Easterly currents during this stage of bank growth controlled an asymmetric, east-directed progradation of the bank edge. During the late middle Miocene, this system was replaced by a twofold configuration of bank development: bank growth continued synchronously with partial bank demise and associated sediment-drift deposition. This turnover is attributed to the onset and/or intensification of the Indian monsoon and the related upwelling and currents, which locally changed environmental conditions and impinged upon the carbonate system. Mega-spillover lobes, shaped by reversing currents, formed as large-scale prograding complexes, which had previously been interpreted as deposits formed during a forced regression. On a regional scale, complex carbonate-platform growth can occur, with coexistence of bank-margin progradation and aggradation as well as partial drowning. It is further shown that a downward shift of clinoforms and offlapping geometries in carbonate platforms are not necessarily indicative of a sea-level-driven forced regression. The findings are expected to be applicable to other Cenozoic platforms in the Indo-Pacific region.
92.
Two detailed geoids have been computed in the region of North Jutland. The first computation used marine data in the offshore areas. For the second computation the marine data set was replaced by the sparser airborne gravity data resulting from the AGMASCO campaign of September 1996. The results of comparisons of the geoid heights at on-shore geometric control showed that the geoid heights computed from the airborne gravity data matched in precision those computed using the marine data, supporting the view that airborne techniques have enormous potential for mapping those unsurveyed areas between the land-based data and the off-shore marine or altimetrically derived data. Received: 7 July 1997 / Accepted: 22 April 1998
93.
94.
Testing the accuracy of 3D modelling algorithms used for geological applications is extremely difficult as model results cannot be easily validated. This paper presents a new approach to evaluate the effectiveness of common interpolation algorithms used in 3D subsurface modelling, utilizing four synthetic grids to represent subsurface environments of varying geological complexity. The four grids are modelled with Inverse Distance Weighting and Ordinary Kriging, using data extracted from the synthetic grids in different spatial distribution patterns (regular, random, clustered and sparse) and with different numbers of data points (100, 256, 676 and 1,600). Utilizing synthetic grids for this evaluation allows quantitative statistical assessment of the accuracy of both interpolation algorithms under a variety of sampling conditions. Data distribution proved to be an important factor: as in many geological situations, relatively small numbers of randomly distributed data points can generate more accurate 3D models than larger amounts of clustered data. This study provides insight into optimizing the quantity and distribution of data required to accurately and cost-effectively interpolate subsurface units of varying complexity.
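The evaluation idea in this abstract, interpolating from scattered samples and scoring against a known synthetic surface, can be sketched in a few lines. This is a hedged illustration, not the paper's code: the `idw` function, the analytic test surface, and the sample counts are stand-ins chosen only to mirror the described workflow.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: weights fall off as 1/d**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    w /= w.sum(axis=1, keepdims=True)      # normalize to a convex combination
    return w @ z_known

# A synthetic "true" surface plays the role of the paper's synthetic grids.
rng = np.random.default_rng(0)
true_surface = lambda x, y: np.sin(x) * np.cos(y)

# Randomly distributed sample points (one of the sampling patterns tested).
pts = rng.uniform(0, 2 * np.pi, size=(256, 2))
z = true_surface(pts[:, 0], pts[:, 1])

# Interpolate onto a regular query grid and measure accuracy against truth.
gx, gy = np.meshgrid(np.linspace(0, 2 * np.pi, 20), np.linspace(0, 2 * np.pi, 20))
query = np.column_stack([gx.ravel(), gy.ravel()])
z_hat = idw(pts, z, query)
rmse = np.sqrt(np.mean((z_hat - true_surface(query[:, 0], query[:, 1])) ** 2))
print(f"RMSE of IDW vs. synthetic truth: {rmse:.3f}")
```

Because the truth is synthetic, the RMSE is an exact accuracy score; repeating this for regular, clustered and sparse point sets reproduces the kind of comparison the study reports.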
95.
Concrete specimens in civil engineering material testing often show fissures or hairline cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image from a camera viewing the whole specimen. Conventional image analysis techniques will detect fissures only if they have a width on the order of one pixel. To detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, which are generated by cross-correlation or least-squares matching, achieve a precision on the order of 1/50 pixel. Hairline cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined with a precision of 1/50 pixel.
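The core trick, finding a sub-pixel crack as a discontinuity in the deformation field rather than in the image itself, can be illustrated on synthetic data. This is a minimal sketch under stated assumptions: the displacement field is fabricated (a 1/20-pixel step at column 32), and the hand-rolled Sobel stands in for whatever edge detector the authors used.

```python
import numpy as np

# Synthetic horizontal-displacement field u(x, y): a crack at column 32
# opens by 0.05 px, i.e. a jump far below one pixel in the images, but
# well above the ~1/50 px matching precision quoted in the abstract.
n = 64
u = np.zeros((n, n))
u[:, 32:] = 0.05          # right half shifts 1/20 pixel relative to left half

def sobel_x(img):
    """Horizontal Sobel response via an explicit 3x3 convolution."""
    k = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    out = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.sum(k * img[i - 1:i + 2, j - 1:j + 2])
    return out

# The crack appears as a linear discontinuity in the deformation field.
edges = np.abs(sobel_x(u))
crack_cols = np.unique(np.nonzero(edges > edges.max() / 2)[1])
print("crack detected near columns:", crack_cols)
```

The point of the sketch: the step is invisible at image resolution, but the Sobel response on the matched deformation field localizes it to within a column, which is exactly why the cascaded approach detects cracks of a fraction of a pixel.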
96.
Many regions around the world require improved gravimetric databases to support very accurate geoid modeling for the modernization of height systems using GPS. We present a simple yet effective method to assess gravity data requirements, particularly the necessary resolution, for a desired precision in geoid computation. The approach is based on simulating high-resolution gravimetry using a topography-correlated model that is adjusted to be consistent with an existing network of gravity data. Analysis of these adjusted, simulated data through Stokes's integral indicates where existing gravity data must be supplemented by new surveys in order to achieve an acceptable level of omission error in the geoid undulation. The simulated model can equally be used to analyze commission error, as well as model error and data inconsistencies to a limited extent. The proposed method is applied to South Korea and shows clearly where existing gravity data are too scarce for precise geoid computation.
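A toy version of the simulation step can be written in one dimension. This is only a loose, hypothetical sketch: the correlation coefficient `c`, the 10 km network spacing, and the use of a short-wavelength RMS as a stand-in for the omission error (the paper evaluates it properly through Stokes's integral) are all assumptions of this illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 100.0, 1001)                            # 1-D profile [km]
topo = np.convolve(rng.normal(0, 50, x.size),
                   np.ones(25) / 25, "same")                 # smoothed heights [m]

# Existing gravity network: one observation every 10 km (toy values, mGal).
net = np.arange(0, x.size, 100)
g_net = 0.05 * topo[net] + rng.normal(0, 0.5, net.size)

# Topography-correlated simulation of high-resolution gravity, adjusted so
# it reproduces the network observations exactly at the network points.
c = 0.05                                                     # assumed mGal per metre
g_sim = np.interp(x, x[net], g_net - c * topo[net]) + c * topo

# Proxy for omission error: the short-wavelength signal that interpolating
# the network alone (i.e. without new surveys) cannot recover.
g_from_net_only = np.interp(x, x[net], g_sim[net])
omission_rms = np.sqrt(np.mean((g_sim - g_from_net_only) ** 2))
print(f"RMS gravity signal missed by a 10 km network: {omission_rms:.2f} mGal")
```

Where this RMS (or, in the real method, the resulting geoid omission error) exceeds tolerance, the network needs densifying; that is the decision the paper's maps support.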
97.
Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown that the accomplished accuracies in object space are either limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with the German guideline for the evaluation of optical 3D measuring systems [VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems – Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ring flashes, a standard method in close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ring flash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ring flash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introducing an image-variant interior orientation in the calibration process improved the results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space).
Extending the parameter model with the FiBun software to model not only an image-variant interior orientation but also deformations in the sensor domain showed significant improvements only for a small group of cameras. With this calibration procedure, the Nikon D3 yielded the best overall accuracy (25 μm maximum absolute length measurement error in object space), indicating at the same time the presence of image-invariant errors in the sensor domain. Overall, the calibration results showed that digital cameras can be applied for accurate photogrammetric surveys and that little effort is needed to greatly improve the accuracy potential of digital cameras.
98.

Background

Forest fuel treatments have been proposed as tools to stabilize carbon stocks in fire-prone forests in the Western U.S.A. Although fuel treatments such as thinning and burning are known to immediately reduce forest carbon stocks, there are suggestions that these losses may be paid back over the long-term if treatments sufficiently reduce future wildfire severity, or prevent deforestation. Although fire severity and post-fire tree regeneration have been indicated as important influences on long-term carbon dynamics, it remains unclear how natural variability in these processes might affect the ability of fuel treatments to protect forest carbon resources. We surveyed a wildfire where fuel treatments were put in place before fire and estimated the short-term impact of treatment and wildfire on aboveground carbon stocks at our study site. We then used a common vegetation growth simulator in conjunction with sensitivity analysis techniques to assess how predicted timescales of carbon recovery after fire are sensitive to variation in rates of fire-related tree mortality, and post-fire tree regeneration.

Results

We found that fuel reduction treatments were successful at ameliorating fire severity at our study site by removing an estimated 36% of aboveground biomass. Treated and untreated stands stored similar amounts of carbon three years after wildfire, but differences in fire severity were such that untreated stands maintained only 7% of aboveground carbon as live trees, versus 51% in treated stands. Over the long-term, our simulations suggest that treated stands in our study area will recover baseline carbon storage 10–35 years more quickly than untreated stands. Our sensitivity analysis found that rates of fire-related tree mortality strongly influence estimates of post-fire carbon recovery. Rates of regeneration were less influential on recovery timing, except when fire severity was high.

Conclusions

Our ability to predict the response of forest carbon resources to anthropogenic and natural disturbances requires models that incorporate uncertainty in processes important to long-term forest carbon dynamics. To the extent that fuel treatments are able to ameliorate tree mortality rates or prevent deforestation resulting from wildfire, our results suggest that treatments may be a viable strategy to stabilize existing forest carbon stocks.
99.
A new method is presented for the computation of the gravitational attraction of topographic masses when their height information is given on a regular grid. It is shown that the representation of the terrain relief by means of a bilinear surface not only offers a serious alternative to the polyhedra modeling, but also approaches even more smoothly the continuous reality. Inserting a bilinear approximation into the known scheme of deriving closed analytical expressions for the potential and its first-order derivatives for an arbitrarily shaped polyhedron leads to a one-dimensional integration with – apparently – no analytical solution. However, due to the high degree of smoothness of the integrand function, the numerical computation of this integral is very efficient. Numerical tests using synthetic data and a densely sampled digital terrain model in the Bavarian Alps prove that the new method is comparable to or even faster than a terrain modeling using polyhedra.
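The efficiency claim rests on a general fact: Gauss-type quadrature converges extremely fast on smooth integrands like the one left over after the analytical reductions. A small sketch makes this concrete; the integrand below is a hypothetical smooth kernel chosen for the demonstration, not the paper's actual terrain integrand.

```python
import numpy as np

# Stand-in for the paper's smooth 1-D integrand: a (1 + t^2)^(-3/2) kernel,
# whose exact integral over [0, 1] is t/sqrt(1+t^2) evaluated at 1, i.e. 1/sqrt(2).
f = lambda t: 1.0 / np.sqrt(1.0 + t**2) ** 3

def gauss_legendre(func, a, b, n):
    """Integrate func over [a, b] with an n-point Gauss-Legendre rule."""
    t, w = np.polynomial.legendre.leggauss(n)          # nodes/weights on [-1, 1]
    return 0.5 * (b - a) * np.sum(w * func(0.5 * (b - a) * t + 0.5 * (b + a)))

exact = 1.0 / np.sqrt(2.0)
for n in (2, 4, 8):
    approx = gauss_legendre(f, 0.0, 1.0, n)
    print(f"{n}-point rule, absolute error: {abs(approx - exact):.2e}")
```

The error collapses by orders of magnitude as nodes are added, which is why a terrain model whose per-cell cost is one such quadrature can compete with, or beat, fully analytical polyhedra formulas.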
100.
Omitted variables and measurement errors in explanatory variables frequently occur in hedonic price models. Ignoring these problems leads to biased estimators. In this paper, we develop a constrained autoregression–structural equation model (ASEM) to handle both types of problems. Standard panel data models to handle omitted variables bias are based on the assumption that the omitted variables are time-invariant. ASEM allows handling of both time-varying and time-invariant omitted variables by constrained autoregression. In the case of measurement error, standard approaches require additional external information which is usually difficult to obtain. ASEM exploits the fact that panel data are repeatedly measured which allows decomposing the variance of a variable into the true variance and the variance due to measurement error. We apply ASEM to estimate a hedonic housing model for urban Indonesia. To get insight into the consequences of measurement error and omitted variables, we compare the ASEM estimates with the outcomes of (1) a standard SEM, which does not account for omitted variables, (2) a constrained autoregression model, which does not account for measurement error, and (3) a fixed effects hedonic model, which ignores measurement error and time-varying omitted variables. The differences between the ASEM estimates and the outcomes of the three alternative approaches are substantial.
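The variance-decomposition idea that ASEM exploits, repeated measurements letting you separate true variance from measurement-error variance, can be shown in a simulation. This is a simplified sketch, not the ASEM estimator itself: it assumes the latent variable is time-invariant across waves, which is exactly the special case where within-unit variation is pure measurement error.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, n_waves = 500, 4                      # panel: each unit measured 4 times
true = rng.normal(10.0, 2.0, n_units)          # latent variable, variance 4
obs = true[:, None] + rng.normal(0.0, 1.0, (n_units, n_waves))  # + error, variance 1

# Within-unit variation identifies the measurement-error variance;
# subtracting it from the total variance recovers the true variance.
err_var = obs.var(axis=1, ddof=1).mean()
total_var = obs.var(ddof=1)
true_var_hat = total_var - err_var
print(f"estimated error variance: {err_var:.2f}  (simulated: 1.00)")
print(f"estimated true variance:  {true_var_hat:.2f}  (simulated: 4.00)")
```

The full model additionally handles time-varying latent components via the constrained autoregression, but the payoff is the same: no external validation data are needed to correct for measurement error.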

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号