Full-text access type
Paid full text | 1221 articles |
Free | 52 articles |
Free (domestic) | 12 articles |
Subject classification
Surveying and mapping | 25 articles |
Atmospheric science | 90 articles |
Geophysics | 293 articles |
Geology | 475 articles |
Oceanography | 118 articles |
Astronomy | 212 articles |
General | 2 articles |
Physical geography | 70 articles |
Publication year
2023 | 8 articles |
2021 | 15 articles |
2020 | 16 articles |
2019 | 33 articles |
2018 | 34 articles |
2017 | 34 articles |
2016 | 37 articles |
2015 | 45 articles |
2014 | 54 articles |
2013 | 81 articles |
2012 | 50 articles |
2011 | 70 articles |
2010 | 52 articles |
2009 | 84 articles |
2008 | 58 articles |
2007 | 53 articles |
2006 | 56 articles |
2005 | 52 articles |
2004 | 54 articles |
2003 | 40 articles |
2002 | 33 articles |
2001 | 24 articles |
2000 | 14 articles |
1999 | 16 articles |
1998 | 12 articles |
1997 | 15 articles |
1996 | 14 articles |
1995 | 15 articles |
1994 | 7 articles |
1993 | 12 articles |
1992 | 13 articles |
1991 | 5 articles |
1990 | 4 articles |
1989 | 14 articles |
1988 | 6 articles |
1987 | 11 articles |
1985 | 8 articles |
1984 | 6 articles |
1983 | 17 articles |
1982 | 10 articles |
1981 | 14 articles |
1980 | 6 articles |
1979 | 7 articles |
1978 | 12 articles |
1977 | 5 articles |
1976 | 7 articles |
1975 | 7 articles |
1974 | 7 articles |
1969 | 6 articles |
1968 | 3 articles |
Sort by: 1285 results found (search time: 15 ms)
131.
The LMDZ4 general circulation model: climate performance and sensitivity to parametrized physics with emphasis on tropical convection (Total citations: 3; self-citations: 11; citations by others: 3)
Frédéric Hourdin Ionela Musat Sandrine Bony Pascale Braconnot Francis Codron Jean-Louis Dufresne Laurent Fairhead Marie-Angèle Filiberti Pierre Friedlingstein Jean-Yves Grandpeix Gerhard Krinner Phu LeVan Zhao-Xin Li François Lott 《Climate Dynamics》2006,27(7-8):787-813
The LMDZ4 general circulation model is the atmospheric component of the IPSL–CM4 coupled model which has been used to perform climate change simulations for the 4th IPCC assessment report. The main aspects of the model climatology (forced by observed sea surface temperature) are documented here, as well as the major improvements with respect to the previous versions, which mainly come from the parametrization of tropical convection. A methodology is proposed to help analyse the sensitivity of the tropical Hadley–Walker circulation to the parametrization of cumulus convection and clouds. The tropical circulation is characterized using scalar potentials associated with the horizontal wind and horizontal transport of geopotential (the Laplacian of which is proportional to the total vertical momentum in the atmospheric column). The effect of parametrized physics is analysed in a regime-sorted framework using the vertical velocity at 500 hPa as a proxy for large-scale vertical motion. Compared to Tiedtke's convection scheme, used in previous versions, Emanuel's scheme improves the representation of the Hadley–Walker circulation, with a relatively stronger and deeper large-scale vertical ascent over tropical continents, and suppresses the marked patterns of concentrated rainfall over oceans. Thanks to the regime-sorted analyses, these differences are attributed to intrinsic differences in the vertical distribution of convective heating, and to the lack of self-inhibition by precipitating downdraughts in Tiedtke's parametrization. Both the convection and cloud schemes are shown to control the relative importance of large-scale convection over land and ocean, an important point for the behaviour of the coupled model.
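The regime-sorted framework described in the abstract can be sketched in a few lines: grid cells are binned by their 500 hPa pressure velocity, and any physical field is then composited within each dynamical regime. The bin edges, field values and array shapes below are illustrative stand-ins, not the paper's actual data.

```python
import numpy as np

def regime_sorted_composite(omega500, field, bin_edges):
    """Composite a physical field into dynamical regimes defined by bins
    of the 500 hPa vertical (pressure) velocity, used as a proxy for
    large-scale vertical motion (negative omega = ascent).

    omega500 : 1-D array of pressure velocities (Pa/s), one per grid cell.
    field    : 1-D array of the quantity to composite (same shape).
    bin_edges: ascending bin edges for omega500.

    Returns (pdf, composite): fraction of cells in each regime, and the
    mean of `field` within each regime (NaN where a regime is empty).
    """
    idx = np.digitize(omega500, bin_edges)      # regime index per cell
    nbins = len(bin_edges) + 1                  # includes the two open bins
    pdf = np.zeros(nbins)
    composite = np.full(nbins, np.nan)
    for k in range(nbins):
        mask = idx == k
        pdf[k] = mask.mean()
        if mask.any():
            composite[k] = field[mask].mean()
    return pdf, composite

# Toy example: ascent regimes (omega < 0) carry most of the rainfall.
rng = np.random.default_rng(0)
omega = rng.normal(0.0, 0.05, 10000)            # synthetic omega500 (Pa/s)
rain = np.where(omega < 0, 8.0, 1.0) + rng.normal(0.0, 0.1, omega.size)
pdf, comp = regime_sorted_composite(omega, rain, np.linspace(-0.1, 0.1, 9))
```

Sorting by a dynamical proxy like this separates parametrization effects from changes in the large-scale circulation itself, which is the point of the regime-sorted comparison between the two convection schemes.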
132.
133.
134.
Jean Pierre Ometto Ana Paula Aguiar Talita Assis Luciana Soler Pedro Valle Graciela Tejada David M. Lapola Patrick Meir 《Climatic change》2014,124(3):545-560
As land use change (LUC), including deforestation, is a patchy process, estimating the impact of LUC on carbon emissions requires spatially accurate underlying data on biomass distribution and change. The methods currently adopted to estimate the spatial variation of above- and below-ground biomass in tropical forests, in particular the Brazilian Amazon, are usually based on remote sensing analyses coupled with field datasets, which tend to be relatively scarce and often limited in their spatial distribution. There are notable differences among the resulting biomass maps found in the literature. These differences subsequently result in relatively high uncertainties in the carbon emissions calculated from land use change, and have a larger impact when biomass maps are coded into biomass classes referring to specific ranges of biomass values. In this paper we analyze the differences among recently-published biomass maps of the Amazon region, including the official information used by the Brazilian government for its communication to the United Nations Framework Convention on Climate Change. The estimated average pre-deforestation biomass in the four maps, for the areas of the Amazon region that had been deforested during the 1990–2009 period, varied from 205 ± 32 Mg ha⁻¹ during 1990–1999, to 216 ± 31 Mg ha⁻¹ during 2000–2009. The biomass values of the deforested areas in 2011 were between 7 and 24% higher than for the average deforested areas during 1990–1999, suggesting that although there was variation in the mean value, deforestation was tending to occur in increasingly carbon-dense areas, with consequences for carbon emissions.
To summarize, our key findings were: (i) the current maps of Amazonian biomass show substantial variation in both total biomass and its spatial distribution; (ii) carbon emissions estimates from deforestation are highly dependent on the spatial distribution of biomass as determined by any single biomass map, and on the deforestation process itself; (iii) future deforestation in the Brazilian Amazon is likely to affect forests with higher biomass than those deforested in the past, resulting in smaller reductions in carbon dioxide emissions than expected purely from the recent reductions in deforestation rates; and (iv) the current official estimate of carbon emissions from Amazonian deforestation is probably overestimated, because the recent loss of higher-biomass forests has not been taken into account.
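The core of the map inter-comparison, averaging each biomass map over the deforested cells and quoting the across-map mean and spread (as in the 205 ± 32 Mg ha⁻¹ figure above), can be sketched as follows. The arrays, mask and biomass levels are hypothetical stand-ins for the four published maps.

```python
import numpy as np

def biomass_map_spread(biomass_maps, deforested_mask):
    """Mean pre-deforestation biomass over a set of deforested cells,
    computed per map, plus the across-map mean and sample standard
    deviation (Mg/ha). A minimal sketch of the map inter-comparison
    described above, not the authors' exact procedure."""
    means = np.array([m[deforested_mask].mean() for m in biomass_maps])
    return means, means.mean(), means.std(ddof=1)

# Toy example: three synthetic maps centred on different biomass levels.
rng = np.random.default_rng(2)
mask = np.zeros((50, 50), dtype=bool)
mask[:10] = True                                   # "deforested" strip
maps = [b + rng.normal(0.0, 5.0, (50, 50)) for b in (180.0, 210.0, 230.0)]
per_map, mean_biomass, spread = biomass_map_spread(maps, mask)
```

The across-map standard deviation is what propagates into the emission uncertainty: the same deforestation polygon overlaid on different maps yields systematically different carbon totals.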
135.
Ground pressure observations made at Macao (22°N, 113°E) from 1953 to 1991 are analyzed and compared with the stratospheric quasi-biennial oscillation (QBO) data obtained during the same interval. The periods of the two phenomena and their time evolution are found to be close to each other. Furthermore, the time series of the stratospheric winds and the S2(p) QBO signature are highly correlated, thus confirming earlier analysis. On this basis, pressure measurements obtained at Batavia (now Djakarta: 6°S, 107°E) from 1870 to 1944 are used to trace back the QBO phenomenon before the advent of systematic stratospheric balloon measurements. The inferred period, which varies between 25 and 32 months, suggests that the QBO has been present in the atmosphere at least since 1870.
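Extracting a QBO-like period from a long surface pressure record amounts to finding the dominant peak of its spectrum. A minimal periodogram-based sketch, using a synthetic 28-month signal rather than the Macao or Batavia data:

```python
import numpy as np

def dominant_period_months(series, dt_months=1.0):
    """Return the dominant period (months) of a monthly time series from
    the peak of its periodogram; a simple stand-in for the spectral
    analysis used to infer the 25-32 month QBO period."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt_months)
    spec[0] = 0.0                      # exclude the zero-frequency (mean) bin
    return 1.0 / freqs[np.argmax(spec)]

# Synthetic record: a 28-month QBO-like cycle plus noise, 75 years long.
t = np.arange(75 * 12)
p = np.sin(2 * np.pi * t / 28.0) + 0.3 * np.random.default_rng(1).normal(size=t.size)
period = dominant_period_months(p)
```

In practice the annual and semiannual pressure tides would be removed first (the S2(p) signature mentioned above is itself a tidal modulation), so this is only the skeleton of the analysis.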
136.
The application of trichloroacetic acid (TCA) as a shell extractant for preparation of soft body parts with reference to tissue metal concentrations (Cd, Cu, Pb, Zn) in shellfish has been evaluated on the example of the mud snail Hydrobia ulvae, a small marine prosobranch densely present in rocky and soft-bottom habitats of the eastern Atlantic. A solution of 0.1 M TCA was tested on individuals treated according to two different protocols: (1) thawed after freezing ("non-dried") and (2) thawed and air-dried to a constant weight ("dried"). Two points were investigated in detail to improve the method: individual soft tissue dry weight and tissue metal concentration following a standard digestion method. In both instances, the results were compared with those from manually dissected snails. Conditions for total shell decalcification of 60 individuals (3-4 mm long) were 5.5 h in 20 ml of 0.1 M TCA. No differences in individual soft tissue weight were observed between the treatments, indicating good efficiency of the TCA extraction with respect to weight of soft body parts. In contrast, tissue metal concentrations varied among treatments. The TCA extraction of the dried animals had a good recovery for Cd, most likely due to the lower solubility of Cd vital cellular components (proteins and mineral concretions) from the dried tissue. Satisfactory recoveries of the tissue concentrations of Cu and Pb were obtained for the non-dried individuals. This might be related to the specific distribution of metals in the organism (namely in the digestive glands and gonads) and their different chemical reactivity with TCA after the tissue was dried. Limited susceptibility of Zn-bearing protein bindings to complexing with TCA also accounts for significantly lower concentrations of Zn in the mud snail's soft tissue that was extracted.
The 0.1 M TCA solution is therefore recommended for extraction of the shells of Hydrobia ulvae for tissue determination of Cd, Cu and Pb; however, the treatment protocol does affect metal recovery and thus a consistent procedure should be followed. The extracted metals from the soft tissues and shells of the mud snails (on the basis of both metal concentrations and contents) were ranked in order of increasing contribution of soft body parts to the total (shell+tissue): Pb
137.
Ivane Lilian Pairaud Nathaniel Bensoussan Pierre Garreau Vincent Faure Joaquim Garrabou 《Ocean Dynamics》2014,64(1):103-115
In the framework of climate change, the increase in ocean heat wave frequency is expected to impact marine life. Large-scale positive temperature anomalies already occurred in the northwestern Mediterranean Sea in 1999, 2003 and 2006. These anomalies were associated with mass mortality events of macrobenthic species in coastal areas (0–40 m in depth). The anomalies were particularly severe in 1999 and 2003 when thousands of kilometres of coasts and about 30 species were affected. The aim of this study was to develop a methodology to assess the current risk of mass mortality associated with temperature increase along NW Mediterranean continental coasts. A 3D regional ocean model was used to obtain the temperature conditions for the period 2001–2010, for which the model outputs were validated by comparing them with in situ observations in affected areas. The model was globally satisfactory, although extremes were underestimated and required correction. Combined with information on the thermo-tolerance of a key species (the red gorgonian P. clavata) as well as its spatial distribution, the modelled temperature conditions were then used to assess the risk of mass mortality associated with thermal stress for the first time. Most of the known areas of observed mass mortality were found using the model, although the degree of risk in certain areas was underestimated. Using climatic IPCC scenarios, the methodology could be applied to explore the impacts of expected climate change in the NW Mediterranean. This is a key issue for the development of sound management and conservation plans to protect Mediterranean marine biodiversity in the face of climate change.
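The risk assessment step, combining modelled temperatures with a species' thermo-tolerance, reduces to counting threshold exceedances per location. The threshold and minimum duration below are illustrative placeholders, not the published tolerance limits of P. clavata:

```python
import numpy as np

def mortality_risk(temps, threshold=25.0, min_days=5):
    """Flag cells at risk of mass mortality when modelled daily bottom
    temperature exceeds a species thermo-tolerance threshold for at
    least `min_days` days. A sketch of the thermal-stress criterion
    described above, with hypothetical threshold/duration values.

    temps: array of shape (days, cells), temperature in deg C.
    Returns a boolean risk flag per cell."""
    exceed_days = (temps > threshold).sum(axis=0)
    return exceed_days >= min_days

# Toy example: one cool cell, one cell persistently above the threshold.
temps = np.stack([np.full(30, 24.0), np.full(30, 26.0)], axis=1)
risk = mortality_risk(temps)
```

Intersecting the resulting risk map with the species distribution, as the study does, then yields the areas where mortality events are expected.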
138.
Modelling of local velocity anomalies: a cookbook (Total citations: 1; self-citations: 0; citations by others: 1)
The determination of small-scale velocity anomalies (from tens to a few hundreds of metres) is a major problem in seismic exploration. The impact of such anomalies on a structural interpretation can be dramatic and conventional techniques such as tomographic inversion or migration velocity analysis are powerless to resolve the ambiguity between structural and velocity origins of anomalies. We propose an alternative approach based on stochastic modelling of numerous anomalies until a set of models is found which can explain the real data. This technique attempts to include as much a priori geological information as possible. It aims at providing the interpreter with a set of velocity anomalies which could possibly be responsible for the structural response. The interpreter can then choose one or several preferred models and pursue a more sophisticated analysis. The class of retained models are all equivalent in terms of data and therefore represent the uncertainty in the model space. The procedure emulates the real processing sequence using a simplified scheme. Essentially, the technique consists of five steps:
1. Interpretation of a structural anomaly in terms of a velocity anomaly with its possible variations in terms of position, size and amplitude.
2. Drawing a model by choosing the parameters of the anomaly within the acceptable range.
3. Modelling the traveltimes in this model and producing the imaging of the reflected interface.
4. Comparing the synthetic data with the real data and keeping the model if it lies within the data uncertainty range.
5. Iterating from step 2.
In order to avoid the high computational cost inherent in using statistical determinations, simplistic assumptions have been made:
- The anomaly is embedded in a homogeneous medium: we assume that the refraction and the time shift due to the anomaly have a first-order effect compared with ray bending in the intermediate layers.
- We model only the zero-offset rays and therefore we restrict ourselves to structural problems.
- We simulate time migration and so address only models of limited structural complexity.
These approximations are justified in a synthetic model which includes strong lateral velocity variations, by comparing the result of a full processing sequence (prestack modelling, stack and depth migration) with the simplified processing. This model is then used in a blind test on the inversion scheme.
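The five-step recipe is an accept/reject loop over candidate anomaly models. A minimal sketch, in which `forward` and `prior_sampler` are user-supplied stand-ins for the zero-offset traveltime modelling and the a priori geological ranges:

```python
import numpy as np

def stochastic_anomaly_search(observed_times, forward, prior_sampler,
                              tolerance, n_models=1000, seed=0):
    """Accept/reject search over velocity-anomaly models: draw anomaly
    parameters from the prior (step 2), model the traveltimes (step 3),
    and keep every model whose synthetic times match the observed times
    within the data uncertainty (step 4), looping back (step 5). The
    retained set represents the uncertainty in model space."""
    rng = np.random.default_rng(seed)
    kept = []
    for _ in range(n_models):
        params = prior_sampler(rng)                  # step 2: draw a model
        synthetic = forward(params)                  # step 3: traveltimes
        misfit = np.max(np.abs(synthetic - observed_times))
        if misfit <= tolerance:                      # step 4: data test
            kept.append(params)                      # equivalent model
    return kept

# Toy usage: the anomaly adds a constant time shift `dt` to a flat reflector.
obs = np.full(50, 1.00)                              # observed two-way times (s)
equivalent = stochastic_anomaly_search(
    obs,
    forward=lambda p: np.full(50, 0.98 + p["dt"]),
    prior_sampler=lambda rng: {"dt": rng.uniform(0.0, 0.05)},
    tolerance=0.005,
)
```

All retained models fit the data equally well within tolerance, which is exactly the set of equivalent anomalies the cookbook hands to the interpreter.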
139.
In this paper, we present a case study on the use of the normalized source strength (NSS) for interpretation of magnetic and gravity gradient tensor data. This application arises in exploration of nickel, copper and platinum group element (Ni‐Cu‐PGE) deposits in the McFaulds Lake area, Northern Ontario, Canada. In this study, we have used the normalized source strength function derived from recent high resolution aeromagnetic and gravity gradiometry data for locating geological bodies. In our algorithm, we use maxima of the normalized source strength for estimating the horizontal location of the causative body. Then we estimate depth to the source and structural index at that point using the ratio between the normalized source strength and its vertical derivative calculated at two levels; the measurement level and a height h above the measurement level. To discriminate more reliable solutions from spurious ones, we reject solutions with unreasonable estimated structural indices. This method uses an upward continuation filter which reduces the effect of high frequency noise. In the magnetic case, the advantage is that, in general, the normalized magnetic source strength is relatively insensitive to magnetization direction, thus it provides more reliable information than standard techniques when geologic bodies carry remanent magnetization. For dipping gravity sources, the calculated normalized source strength yields a reliable estimate of the source location by peaking right above the top surface. Application of the method on aeromagnetic and gravity gradient tensor data sets from the McFaulds Lake area indicates that most of the gravity and magnetic sources are located just beneath a 20 m thick (on average) overburden, and that the delineated magnetic and gravity sources, which can probably be approximated by geological contacts and thin dikes, come up to the overburden.
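The two-level ratio estimator can be illustrated with an idealized source. Assuming the simple power-law fall-off NSS ~ A/(depth + height)^n (a textbook form for idealized sources, not necessarily the authors' exact formulation), the ratio NSS/|dNSS/dh| equals (depth + height)/n, so evaluating it at the measurement level and at a height h above it gives two equations for the two unknowns:

```python
import numpy as np

def nss_depth_and_index(mu0, dmu0, muh, dmuh, h):
    """Estimate source depth and structural index from the normalized
    source strength (NSS) and its vertical derivative at two levels:
    the measurement level (mu0, dmu0) and a height h above it
    (muh, dmuh). Assumes NSS ~ A/(depth + height)^n."""
    a = mu0 / abs(dmu0)          # = depth / n at the measurement level
    b = muh / abs(dmuh)          # = (depth + h) / n at the upper level
    n = h / (b - a)              # structural index
    depth = n * a                # depth below the measurement level
    return depth, n

# Synthetic check: a point-like source (n = 3) buried 100 m deep.
d_true, n_true, A = 100.0, 3.0, 1e6
mu = lambda z: A / (d_true + z) ** n_true
dmu = lambda z: -n_true * A / (d_true + z) ** (n_true + 1)
depth, n = nss_depth_and_index(mu(0.0), dmu(0.0), mu(50.0), dmu(50.0), h=50.0)
```

Rejecting solutions whose estimated `n` falls outside the physically plausible range (roughly 0 to 3) is the screening step the abstract describes.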
140.
Pierre Javelle Julie Demargne Dimitri Defrance Jean Pansu Patrick Arnaud 《水文科学杂志》2014,59(7):1390-1402
Abstract: This article presents a comparison between real-time discharges calculated by a flash-flood warning system and post-event flood peak estimates. The studied event occurred on 15 and 16 June 2010 at the Argens catchment located in the south of France. Real-time flood warnings were provided by the AIGA (Adaptation d’Information Géographique pour l’Alerte en Crue) warning system, which is based on a simple distributed hydrological model run at a 1-km² resolution using radar rainfall information. The timing of the warnings (updated every 15 min) was compared to the observed flood impacts. Furthermore, “consolidated” flood peaks estimated by an intensive post-event survey were used to evaluate the AIGA-estimated peak discharges. The results indicated that the AIGA warnings clearly identified the most affected areas. However, the effective lead-time of the event detection was short, especially for fast-response catchments, because the current method does not take into account any rainfall forecast. The flood peak analysis showed a relatively good correspondence between AIGA- and field-estimated peak values, although some differences were due to the rainfall underestimation by the radar and rainfall–runoff model limitations.
Editor Z.W. Kundzewicz; Guest editor R.J. Moore. Citation: Javelle, P., Demargne, J., Defrance, D., Pansu, J. and Arnaud, P., 2014. Evaluating flash-flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, 59 (7), 1390–1402. http://dx.doi.org/10.1080/02626667.2014.923970
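The peak-discharge evaluation step, comparing model estimates against post-event survey values, can be summarized with simple agreement metrics. The metrics and numbers below are illustrative, not those reported in the paper:

```python
import numpy as np

def peak_agreement(model_peaks, survey_peaks):
    """Compare model-estimated flood peaks (m^3/s) with post-event survey
    estimates at the same sites. Returns the mean relative bias and the
    fraction of sites where the model peak falls within a factor of two
    of the survey value; both are common, illustrative skill measures."""
    m = np.asarray(model_peaks, dtype=float)
    s = np.asarray(survey_peaks, dtype=float)
    bias = np.mean((m - s) / s)                       # signed relative error
    within2 = np.mean((m >= s / 2) & (m <= s * 2))    # factor-of-two hit rate
    return bias, within2

# Toy usage with three hypothetical sites.
bias, within2 = peak_agreement([90.0, 200.0, 400.0], [100.0, 100.0, 400.0])
```

Because post-event survey estimates carry their own uncertainty, a factor-of-two criterion is a pragmatic (if coarse) yardstick for this kind of ungauged-site verification.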