11.
Gas streaming through the solar system experiences both destructive and scattering processes, the latter primarily in collisional interactions with the solar wind protons. The scattering interactions can be important in filling the downstream wake. They may effectively increase the velocity dispersion and also cause discrete orbit changes. The downstream intensity moment is here evaluated analytically for particles suffering a single, discrete collision, and compared with the moment from a thermal velocity dispersion (both in the absence of a central force field). The elastic scattering collisions of protons in H-gas lead to a contribution to the Lα backscatter from the wake equivalent to an initial thermal velocity of about 1 km s−1, giving an intensity for cool gas of the order of 10 R. This exceeds the contribution due to focussing in the solar gravitational field if the radiation pressure is not less than 0.8 of the gravitational attraction.
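For orientation, the balance between solar gravity and radiation pressure invoked in the last sentence is conventionally expressed through a single ratio; a minimal sketch of that standard convention (the symbol μ and this form of the effective force are not taken from the paper itself):

$$
F_{\mathrm{eff}}(r) \;=\; -\,\frac{(1-\mu)\,G M_{\odot}\, m}{r^{2}},
\qquad
\mu \;\equiv\; \frac{F_{\mathrm{rad}}}{F_{\mathrm{grav}}} .
$$

With μ ≥ 0.8, the net inward force is at most 20% of gravity alone, so gravitational focusing into the downstream wake is weak and the single-collision scattering contribution estimated above can dominate the backscatter signal.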
12.
Combining a geological model with a geomechanical model, it generally turns out that the geomechanical model is built from units that are at least 100 times larger in volume than the units of the geological model. To counter this mismatch in scales, the geological data model's heterogeneous fine-scale Young's moduli and Poisson's ratios have to be “upscaled” to one “equivalent homogeneous” coarse-scale rigidity. This coarse-scale rigidity relates the volume-averaged displacement, strain, stress, and energy to each other, in such a way that the equilibrium equation, Hooke's law, and the energy equation preserve their fine-scale form on the coarse scale. Under the simplifying assumption of spatial periodicity of the heterogeneous fine-scale rigidity, homogenization theory can be applied. However, even then the spatial variability is generally so complex that exact solutions cannot be found. Therefore, numerical approximation methods have to be applied. Here the node-based finite element method with the displacement as primary variable has been used. Three numerical examples showing the upper bound character of this finite element method are presented.
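As a purely illustrative aside (not the paper's finite element scheme; the moduli and volume fractions below are invented), the upper and lower bound character of upscaled stiffness can already be seen in the classical Voigt (arithmetic) and Reuss (harmonic) volume averages of a heterogeneous Young's modulus:

```python
import numpy as np

def voigt_reuss_bounds(young_moduli, volume_fractions):
    """Classical upper (Voigt) and lower (Reuss) bounds on the effective
    Young's modulus of a composite of isotropic units.

    Illustrative only: the paper itself solves the periodic homogenization
    problem with a node-based finite element method, which likewise yields
    an upper bound on the true effective rigidity.
    """
    E = np.asarray(young_moduli, dtype=float)
    f = np.asarray(volume_fractions, dtype=float)
    f = f / f.sum()                      # normalize volume fractions
    E_voigt = np.sum(f * E)              # arithmetic (iso-strain) average
    E_reuss = 1.0 / np.sum(f / E)        # harmonic (iso-stress) average
    return E_voigt, E_reuss

# Hypothetical two-phase example: stiff and soft units of a geological model.
upper, lower = voigt_reuss_bounds([30e9, 5e9], [0.4, 0.6])
print(f"Voigt (upper) = {upper/1e9:.1f} GPa, Reuss (lower) = {lower/1e9:.1f} GPa")
```

Any admissible coarse-scale rigidity must lie between these two averages, which is why displacement-based finite element estimates can only overshoot, never undershoot, the true value.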
13.
We examined the hypothesis that minima in local recurrence time, TL, or equivalently maxima in local probability, PL, may map asperities in the Kanto and Tokai areas of Japan, where the earthquake catalog of the National Research Institute for Earth Science and Disaster Prevention (NIED) is complete at the M=1.5 (M1.5) level. We mapped TL (PL) based on the a- and b-values of the nearest earthquakes within 20 km of every node of a grid spaced 0.01° for M7 target events. Only earthquakes within the top 33 km were used. The b-values increase strongly with depth in several areas. Therefore, some of the TL (PL) anomalies are not revealed if data from the entire crustal seismogenic zone are mixed. Thus, we mapped TL (PL) separately for the top 15 km and the rest of the depth range, as well as for the entire seismogenic crust. The resulting TL- and PL-maps show that approximately 12% of the total area shows anomalously short recurrence times. Of the six shallow target events with M≥6.5 that occurred since 1890, five are located within the anomalous areas with TL < 450 years. We interpret this to mean that areas with anomalously short TL map asperities, which are more likely than other areas to generate future target events. The probability that this result is due to chance is vanishingly small. The great Kanto rupture of 1923 appears to have initiated in the most significant asperity we mapped in the study area. One anomaly is located in the northeastern part of the area of the proposed future rupture of the Tokai earthquake, and another one at its southwestern corner. The absolute values of TL calculated are uncertain because they depend on the size of the volume used for the calculation.
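For concreteness, under the standard Gutenberg–Richter interpretation of the a- and b-values, the local recurrence time for an M7 target event follows directly; a minimal sketch with invented node values, not the paper's actual estimates:

```python
def local_recurrence_time(a_value, b_value, m_target=7.0, catalog_years=1.0):
    """Local recurrence time T_L for a target magnitude, assuming the
    Gutenberg-Richter relation log10 N(>=M) = a - b*M, with the a-value
    describing the event count over the length of the local catalog.

    Illustrative only: the study estimates a and b from the earthquakes
    within 20 km of each 0.01-degree grid node.
    """
    annual_rate = 10.0 ** (a_value - b_value * m_target) / catalog_years
    return 1.0 / annual_rate          # expected years between target events

# Hypothetical node: a = 4.8, b = 0.85 estimated from a 20-year catalog.
print(f"T_L(M7) = {local_recurrence_time(4.8, 0.85, 7.0, 20.0):.0f} years")
```

A node like this hypothetical one (roughly 280 years) would fall inside the anomalously short TL < 450-year class discussed above, whereas a higher b-value or lower a-value quickly pushes TL into the thousands of years.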
14.
A simple grid cell-based distributed hydrologic model was developed to provide spatial information on hydrologic components for determining hydrologically based critical source areas. The model represents the critical process (soil moisture variation) to run-off generation accounting for both local and global water balance. In this way, it simulates both infiltration excess run-off and saturation excess run-off. The model was tested by multisite and multivariable evaluation on the 50-km² Little River Experimental Watershed I in Georgia, U.S. and 2 smaller nested subwatersheds. Water balance, hydrograph, and soil moisture were simulated and compared to observed data. For streamflow calibration, the daily Nash-Sutcliffe coefficient was 0.78 at the watershed outlet and 0.56 and 0.75 at the 2 nested subwatersheds. For the validation period, the Nash-Sutcliffe coefficients were 0.79 at the watershed outlet and 0.85 and 0.83 at the 2 subwatersheds. The per cent bias was less than 15% for all sites. For soil moisture, the model also predicted the rising and declining trends at 4 of the 5 measurement sites. The spatial distribution of surface run-off simulated by the model was mainly controlled by local characteristics (precipitation, soil properties, and land cover) on dry days and by global watershed characteristics (relative position within the watershed and hydrologic connectivity) on wet days when saturation excess run-off was simulated. The spatial details of run-off generation and travel time along flow paths provided by the model are helpful for watershed managers to further identify critical source areas of non-point source pollution and develop best management practices.
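For reference, the two goodness-of-fit measures quoted above can be computed from paired observed and simulated daily streamflow series as follows; a minimal sketch (function names are illustrative, and the sign convention for the bias varies between studies):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    performs no better than the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(observed, simulated):
    """Per cent bias: with this (obs - sim) convention, positive values
    indicate overall underestimation by the model."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)
```

Values of the Nash-Sutcliffe coefficient above roughly 0.75 and absolute bias below 15%, as reported for this model, are commonly read as good daily streamflow performance.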
15.
In this paper, we performed a sensitivity analysis of the snow module of the GEOtop2.0 model at point and catchment scale in a small high-elevation catchment in the Eastern Italian Alps (catchment size: 61 km²). Simulated snow depth and snow water equivalent at the point scale were compared with measured data at four locations from 2009 to 2013. At the catchment scale, simulated snow-covered area (SCA) was compared with binary snow cover maps derived from Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat satellite imagery. Sensitivity analyses were used to assess the effect of different model parameterizations on model performance at both scales and the effect of different thresholds of simulated snow depth on the agreement with MODIS data. Our results at point scale indicated that modifying only the “snow correction factor” resulted in substantial improvements of the snow model and effectively compensated inaccurate winter precipitation by enhancing snow accumulation. SCA inaccuracies at catchment scale during the accumulation and melt periods were affected little by different snow depth thresholds when using calibrated winter precipitation from the point scale. However, inaccuracies were strongly controlled by topographic characteristics and by the model parameterizations driving snow albedo (“snow ageing coefficient” and “extinction of snow albedo”) during the accumulation and melt periods. Although the highest accuracies (overall accuracy = 1 in 86% of the catchment area) were observed during winter, lower accuracies (overall accuracy < 0.7) occurred during the early accumulation and melt periods (in 29% and 23% of the area, respectively), mostly in areas with grassland and forest, slopes of 20–40°, areas exposed to the NW, or areas with a topographic roughness index of −0.25 to 0 m. These findings may give recommendations for defining more effective model parameterization strategies and guide future work, in which simulated and MODIS SCA may be combined to generate improved products for SCA monitoring in Alpine catchments.
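The catchment-scale comparison reduces to thresholding a simulated snow-depth field into binary snow cover and counting agreement with the MODIS map; a minimal sketch under stated assumptions (the threshold value, the cloud coding, and the tiny synthetic grid are hypothetical, not taken from the paper):

```python
import numpy as np

def sca_overall_accuracy(sim_snow_depth, modis_binary, depth_threshold=0.05):
    """Overall accuracy of simulated snow-covered area (SCA) against a
    binary MODIS snow map: the fraction of valid pixels where the two agree.

    The depth threshold (metres) that converts simulated snow depth into
    binary snow cover is exactly the kind of parameter whose sensitivity
    the study examines.
    """
    sim_binary = (np.asarray(sim_snow_depth) >= depth_threshold).astype(int)
    modis = np.asarray(modis_binary).astype(int)
    valid = modis >= 0                      # assume cloud/no-data pixels coded as -1
    return np.mean(sim_binary[valid] == modis[valid])

# Tiny synthetic 3x3 example (depths in metres; -1 marks a cloudy MODIS pixel).
sim = np.array([[0.00, 0.10, 0.30], [0.02, 0.06, 0.00], [0.50, 0.00, 0.04]])
obs = np.array([[0, 1, 1], [0, 1, 0], [1, -1, 1]])
print(f"overall accuracy = {sca_overall_accuracy(sim, obs):.2f}")   # 0.88
```

Raising or lowering the threshold mainly reclassifies shallow-snow pixels, which is why the study finds the accumulation and melt periods far more sensitive to it than mid-winter.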
16.
17.
18.
This article provides an analysis of the EU Emissions Trading Scheme (ETS) and the harmonized benchmark-based allocation procedures by comparing two energy-intensive sectors with activities in three Member States. These sectors include the cement industry (CEI) and the pulp and paper industry (PPI) in the UK, Sweden, and France. Our results show that the new procedures are better suited for the more homogeneous CEI, in which the outcome of stricter allocation of emissions allowances is consistent between Member States. For the more heterogeneous PPI – in terms of its product portfolios, technical infrastructures, and fuel mixes – the allocation procedures lead to diverse outcomes. It is the lack of product benchmark curves, and the alternative use of benchmark values that are biased towards a fossil fuel-mix and are based on specific energy use rather than emission intensity, which leads to allocations to the PPI that do not represent the average performance of the top 10% of GHG-efficient installations. Another matter is that grandfathering is still present via the historically based production volumes. How to deal with structural change and provisions regarding capacity reductions and partial cessation is an issue that is highly relevant for the PPI but less so for the CEI.

Policy relevance

After an unprecedented amount of consultation with industrial associations and other stakeholders, a harmonized benchmark-based allocation methodology was introduced in the third trading period of the EU ETS. Establishing a reliable and robust benchmark methodology for free allocation that shields against high direct carbon costs, is perceived as fair and politically acceptable, and still incentivizes firms to take action, is a significant challenge. This article contributes to a deeper understanding of the challenges in effectively applying harmonized rules in industrial sectors that are heterogeneous. This is essential for the debate on structural reform of the EU ETS, and for sharing experiences with other emerging emissions trading systems in the world that also consider benchmark methodologies.
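To make the allocation logic concrete, a minimal sketch of benchmark-based free allocation under stated assumptions (the installation data, the simple top-10% cut-off, and all numbers below are hypothetical; the actual EU ETS rules add correction and carbon-leakage factors not shown here):

```python
def product_benchmark(intensities, top_share=0.10):
    """Benchmark value as the average emission intensity (tCO2 per tonne of
    product) of the most GHG-efficient installations making up the top share."""
    ranked = sorted(intensities)                      # most efficient first
    n_top = max(1, round(top_share * len(ranked)))
    return sum(ranked[:n_top]) / n_top

def free_allocation(benchmark, historical_activity):
    """Free allowances = benchmark x historically based production volume,
    which is where a grandfathering element re-enters the scheme."""
    return benchmark * historical_activity

# Hypothetical sector of 20 installations (tCO2 per tonne of product).
intensities = [0.60 + 0.02 * i for i in range(20)]
bm = product_benchmark(intensities)
print(f"benchmark = {bm:.2f} tCO2/t, allocation = {free_allocation(bm, 1_000_000):,.0f} allowances")
```

The sketch also shows why heterogeneity matters: a single benchmark curve is only meaningful when the installations actually make a comparable product, which holds for cement but not for the pulp and paper industry's mixed portfolios.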

19.
Ecological-niche factor analysis (ENFA) was applied to the reef framework-forming cold-water coral Lophelia pertusa. The environmental tolerances of this species were assessed using readily available oceanographic data, including physical, chemical, and biological variables. L. pertusa was found at mean depths of 468 and 480 m on the regional and global scales and occupied a niche that included higher than average current speed and productivity, supporting the theory that their limited food supply is locally enhanced by currents. Most records occurred in areas with a salinity of 35, mean temperatures of 6.2–6.7 °C and dissolved oxygen levels of 6.0–6.2 ml l⁻¹. The majority of records were found in areas that were saturated with aragonite but had low concentrations of nutrients (silicate, phosphate, and nitrate). Suitable habitat for L. pertusa was predicted using ENFA on a global and a regional scale that incorporated the north-east Atlantic Ocean. Regional prediction was reliable due to numerous presence points throughout the area, whereas global prediction was less reliable due to the paucity of presence data outside of the north-east Atlantic. However, the species niche was supported at each spatial scale. Predicted maps at the global scale reinforced the general consensus that the North Atlantic Ocean is a key region in the worldwide distribution of L. pertusa. Predictive modelling is an approach that can be applied to cold-water coral species to locate areas of suitable habitat for further study. It may also prove a useful tool to assist spatial planning of offshore marine protected areas. However, issues with eco-geographical datasets, including their coarse resolution and limited geographical coverage, currently restrict the scope of this approach.
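The core quantity ENFA extracts from presence-only records is marginality, i.e. how far the species' mean conditions lie from the study-area mean; a minimal sketch of one common formulation (after Hirzel et al., 2002; array names and shapes are illustrative, and the full analysis also derives specialization factors not shown here):

```python
import numpy as np

def enfa_marginality(presence, background):
    """Per-variable marginality and a single overall marginality index.

    presence, background: arrays of shape (n_records, n_variables) holding
    environmental values (e.g. depth, temperature, salinity, current speed)
    at species records and across the whole study area, respectively.
    """
    pres = np.asarray(presence, dtype=float)
    back = np.asarray(background, dtype=float)
    # Distance of species mean from global mean, in global standard deviations.
    per_var = (pres.mean(axis=0) - back.mean(axis=0)) / back.std(axis=0)
    # Overall index; values near or above 1 indicate clearly non-random habitat use.
    overall = np.linalg.norm(per_var) / 1.96
    return per_var, overall
```

Applied to the variables above, a strongly positive marginality for current speed and productivity would express, in one number per variable, the niche shift this abstract describes in words.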
20.
Although meteorites are now considered as scientific objects, they still bear a strong and powerful symbolic meaning due to their extraterrestrial provenance. The present article focuses on their legal status, in other words the collection of rules, very diverse in nature, which are applicable to them. Despite a growing international market, the question of meteorites is often ignored or regarded as a detail in international relations and is rarely taken explicitly into account in negotiations and treaties. This relative neglect explains why a non-State player, the Meteoritical Society, has taken methodological initiatives in meteoritic science and has effectively become a regulator of meteorite naming and acceptance, with a global scope. We show that to understand the legal status of meteorites, it is necessary to consider them under the prism of public international law, transnational law, and national law. We conclude that, despite the universality of meteorites as extraterrestrial objects, the legal rules applicable to them vary depending on the territory onto which they fall or where they are found. We note, however, that there is a trend toward regulatory uniformity in the scientific analysis of meteorites, which frames the practices of researchers and regulates traders’ activities. Finally, we contend that a meteorite remains a badly defined legal object, because it can be viewed from many angles: as an object susceptible to private appropriation, as a “common thing” (res communis), or as an element of national heritage.