Full-text access type
Paid full text | 182 articles |
Free | 19 articles |
Subject classification
Surveying and mapping | 8 articles |
Atmospheric sciences | 17 articles |
Geophysics | 58 articles |
Geology | 68 articles |
Oceanography | 11 articles |
Astronomy | 28 articles |
Multidisciplinary | 4 articles |
Physical geography | 7 articles |
Publication year
2023 | 1 article |
2022 | 1 article |
2021 | 6 articles |
2020 | 8 articles |
2019 | 3 articles |
2018 | 14 articles |
2017 | 16 articles |
2016 | 14 articles |
2015 | 19 articles |
2014 | 13 articles |
2013 | 15 articles |
2012 | 3 articles |
2011 | 13 articles |
2010 | 10 articles |
2009 | 16 articles |
2008 | 8 articles |
2007 | 7 articles |
2006 | 7 articles |
2005 | 4 articles |
2004 | 2 articles |
2003 | 3 articles |
2002 | 3 articles |
2001 | 3 articles |
2000 | 1 article |
1999 | 3 articles |
1998 | 3 articles |
1996 | 1 article |
1995 | 2 articles |
1994 | 1 article |
1990 | 1 article |
Sort by: 201 results in total; search took 15 ms
11.
Jesús Gallego 《Astrophysics and Space Science》1998,263(1-4):1-14
The evolution of the Star Formation Rate (SFR) density of the Universe as a function of look-back time is a fundamental parameter for understanding the formation and evolution of galaxies. The current picture, outlined only in recent years, is that the global SFR density has dropped by about an order of magnitude from a redshift of z∼1.5 to the current value at z=0. Because these SFR density studies now extend over the whole range in redshift, it becomes mandatory to combine data from different SFR tracers. At low redshifts, optical emission lines are the most widely used. Using Hα as a current-SFR tracer, the Universidad Complutense de Madrid (UCM) Survey provided the first estimate of the global SFR density in the Local Universe. The Hα flux in emission is directly related to the number of ionizing photons and, modulo the IMF, to the total mass of stars formed. Metal lines such as [OII]λ3727 and [OIII]λ5007 are affected by metallicity and excitation. Beyond redshift z∼0.4, Hα is not observable in the optical, and [OII]λ3727 or UV luminosities have to be used. The UCM galaxy sample has been used to obtain a calibration between [OII]λ3727 luminosity and SFR especially suitable for the different types of star-forming galaxies found by deep spectroscopic surveys at redshifts up to z∼1.5. These calibrations, when applied to recent deep redshift surveys, confirm the drop in the SFR density of the Universe since z∼1 previously inferred from the UV. However, the fundamental parameter that determines galactic evolution is mass, not luminosity. The mass function of local star-forming galaxies is critical for any future comparison with other galaxy populations of different evolutionary status. Hα velocity widths for UCM galaxies indicate that, apart from a small fraction of 10^10–10^11 M⊙ spirals with starburst nuclei, the majority have dynamical masses in the ∼10^9 M⊙ range. A comparison with published data for faint blue galaxies suggests that star-forming galaxies at z∼1 would have SFR per unit mass and burst strengths similar to those at z=0, while being intrinsically more massive.
This revised version was published online in July 2006 with corrections to the Cover Date.
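The Hα-to-SFR conversion mentioned above can be sketched numerically. The UCM survey's own calibration is not reproduced in the abstract, so the widely used Kennicutt (1998) conversion for a Salpeter IMF is assumed here instead:

```python
# Hedged sketch: converting an observed H-alpha luminosity into a star
# formation rate. Assumes the standard Kennicutt (1998) calibration,
# SFR [Msun/yr] = L(Halpha) [erg/s] / 1.26e41, not the UCM survey's own fit.

def sfr_from_halpha(l_halpha_erg_s: float) -> float:
    """Return the SFR in solar masses per year from an H-alpha luminosity in erg/s."""
    return l_halpha_erg_s / 1.26e41

# A galaxy with L(Halpha) = 1.26e41 erg/s forms about 1 Msun/yr.
print(sfr_from_halpha(1.26e41))
```

The same structure applies to the [OII]λ3727 calibration the survey derives, with a different (metallicity- and excitation-sensitive) conversion factor.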
12.
André Deprit Jesús Palacián Etienne Deprit 《Celestial Mechanics and Dynamical Astronomy》2001,79(3):157-182
The relegation algorithm extends the method of normalization by Lie transformations. Given a Hamiltonian that is a power series H = H0 + εH1 + ··· in a small parameter ε, normalization constructs a map which converts the principal part H0 into an integral of the transformed system; relegation does the same for an arbitrary function G. If the Lie derivative induced by G is semi-simple, a double recursion produces the generator of the relegating transformation. The relegation algorithm is illustrated with an elementary example borrowed from galactic dynamics; the exercise serves as a standard against which to test software implementations. Relegation is also applied to the more substantial example of a Keplerian system perturbed by radiation pressure emanating from a rotating source.
This revised version was published online in October 2005 with corrections to the Cover Date.
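The setup can be written compactly in standard Lie-transform notation (the symbols below are conventional choices, since the abstract's originals were lost in extraction):

```latex
\mathcal{H} = \mathcal{H}_0 + \varepsilon\,\mathcal{H}_1 + \varepsilon^2\,\mathcal{H}_2 + \cdots,
\qquad
\mathcal{H}' = \exp(L_W)\,\mathcal{H},
\qquad
L_W = \{\,\cdot\,,\,W\},
```

where $W$ is the generator of the transformation. Normalization chooses $W$ so that $\{\mathcal{H}_0, \mathcal{H}'\} = 0$, making $\mathcal{H}_0$ an integral of the transformed system; relegation instead enforces $\{G, \mathcal{H}'\} = 0$ for an arbitrary function $G$.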
13.
Juan I. López-Moreno Jesús Revuelto E. Alonso-González Alba Sanmiguel-Vallelado Steven R. Fassnacht Jeffrey Deems Enrique Morán-Tejeda 《山地科学学报》2017,14(5):823-842
This study demonstrated the usefulness of very long-range terrestrial laser scanning (TLS) for analyzing the spatial distribution of a snowpack at distances up to 3000 m, one of the longest measurement ranges reported to date. Snow depth data were collected with a terrestrial laser scanner during 11 periods of snow accumulation and melting, over three snow seasons, on a Pyrenean hillslope characterized by a large elevational gradient, steep slopes, and avalanche occurrence. The maximum and mean absolute snow depth errors found were 0.5–0.6 m and 0.2–0.3 m, respectively; this may prove problematic for areas with a shallow snowpack, but is sufficiently accurate to determine snow distribution patterns in areas characterized by a thick snowpack. The results indicated that in most cases there was temporal consistency in the spatial distribution of the snowpack, even in different years. The spatial patterns were particularly similar amongst the surveys conducted during the period dominated by snow accumulation (generally until the end of April), or amongst those conducted during the period dominated by melting processes (generally after mid-April or early May). Simple linear correlation analyses for the 11 survey dates, and the application of Random Forests analysis to two days representative of the accumulation and melting periods, indicated the importance of topography to the snow distribution. The results also highlight that elevation and the Topographic Position Index (TPI) were the main variables explaining the snow distribution, especially during periods dominated by melting. The intra- and inter-annual spatial consistency of the snowpack distribution suggests that the geomorphological processes linked to the presence/absence of snow cover act in a similar way in the long term, and that these spatial patterns can be readily identified through several years of adequate monitoring.
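The kind of Random Forests analysis described above, ranking topographic predictors of snow depth by importance, can be sketched as follows. The data are synthetic; the predictor names (elevation, TPI, slope) follow the abstract, but the study's actual predictor set and data are assumptions:

```python
# Hedged sketch: rank topographic predictors of snow depth with a Random
# Forest, as in the study's variable-importance analysis. Synthetic data;
# snow depth is constructed to depend mainly on elevation and TPI, which
# is the pattern the study reports.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
elevation = rng.uniform(1500, 2700, n)   # m a.s.l.
tpi = rng.normal(0, 1, n)                # Topographic Position Index
slope = rng.uniform(0, 45, n)            # degrees (no effect here)
snow_depth = 0.002 * elevation - 0.5 * tpi + rng.normal(0, 0.2, n)

X = np.column_stack([elevation, tpi, slope])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, snow_depth)
for name, imp in zip(["elevation", "TPI", "slope"], rf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

With this construction, elevation and TPI receive most of the importance and the uninformative slope variable ranks last.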
14.
Maldonado Andrés Carlos Balanyá Juan Barnolas Antonio Galindo-Zaldívar Jesús Hernández Javier Jabaloy Antonio Livermore Roy Miguel Martínez-Martínez José Rodríguez-Fernández José Sanz de Galdeano Carlos Somoza Luis Suriñach Emma Viseras César 《Marine Geophysical Researches》2000,21(1-2):43-68
New swath bathymetric, multichannel seismic and magnetic data reveal the complexity of the intersection between the extinct West Scotia Ridge (WSR) and the Shackleton Fracture Zone (SFZ), a first-order NW-SE trending high-relief ridge cutting across the Drake Passage. The SFZ is composed of shallow ridge segments and depressions, largely parallel to the fracture zone, with an 'en echelon' pattern in plan view. These features are bounded by tectonic lineaments, interpreted as faults. The axial valley of the spreading center intersects the fracture zone in a complex area of deformation, where N120°E lineaments and E–W faults anastomose on both sides of the intersection. The fracture zone developed within an extensional regime, which facilitated the formation of oceanic transverse ridges parallel to the fracture zone and depressions attributed to pull-apart basins, bounded by normal and strike-slip faults. On the multichannel seismic (MCS) profiles, the igneous crust is well stratified, with numerous discontinuous high-amplitude reflectors and many irregular diffractions at the top, and a thicker layer below. The latter has sparse and weak reflectors, although it locally contains strong, dipping reflections. A bright, slightly undulating reflector observed below the spreading center axial valley at about 0.75 s (twt) depth in the igneous crust is interpreted as an indication of the relict axial magma chamber. Deep, high-amplitude subhorizontal and slightly dipping reflections are observed between 1.8 and 3.2 s (twt) below sea floor, but are preferentially located at about 2.8–3.0 s (twt) depth. Where these reflections are more continuous they may represent the Mohorovicic seismic discontinuity. More locally, short (2–3 km long), very high-amplitude reflections observed at 3.6 and 4.3 s (twt) depth below sea floor are attributed to an interlayered upper mantle transition zone.
The MCS profiles also show a pattern of regularly spaced, steeply inclined reflectors, which cut across layers 2 and 3 of the oceanic crust. These reflectors are attributed to deformation under a transpressional regime that developed along the SFZ shortly after spreading ceased at the WSR. Magnetic anomalies 5 to 5E may be confidently identified on the flanks of the WSR. Our spreading model assumes slow rates (ca. 10–20 mm/yr), with slight asymmetries favoring the southeastern flank between 5C and 5, and the northwestern flank between 5 and extinction. The spreading-rate asymmetry means that accretion was slower during formation of the steeper, shallower southeastern flank than of the northwestern flank.
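The slow rates quoted above can be checked with simple arithmetic: a half-spreading rate is the distance between dated anomaly crossings divided by their age difference, and 1 km/Myr conveniently equals 1 mm/yr. The chron ages and the 95 km separation below are round illustrative assumptions, not values from the paper:

```python
# Hedged plausibility check for slow spreading rates from magnetic anomalies.
# Since 1 km/Myr = 1 mm/yr, the half rate in mm/yr is simply the anomaly
# separation in km over the age difference in Myr.

def half_spreading_rate_mm_yr(distance_km: float,
                              age_older_ma: float,
                              age_younger_ma: float) -> float:
    """Half-spreading rate in mm/yr between two dated anomaly crossings."""
    return distance_km / (age_older_ma - age_younger_ma)

# e.g. chron 5 (~9.7 Ma) to chron 5C (~16.0 Ma), 95 km apart on one flank:
rate = half_spreading_rate_mm_yr(95.0, 16.0, 9.7)
print(round(rate, 1))  # lands inside the quoted ca. 10-20 mm/yr range
```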
15.
16.
Daniel Higuero Juan M. Tirado Jesús Carretero Fernando Félix Antonio de la Fuente 《Astrophysics and Space Science》2009,321(3-4):169-175
Institutions such as NASA, ESA or JAXA must find solutions for distributing data from their missions to the scientific community and to their long-term archives. This is a complex problem, as it involves a vast amount of data, several geographically distributed archives, heterogeneous architectures with heterogeneous networks, and users spread around the world. We propose a novel architecture that addresses this problem, aiming to fulfill the requirements of the end user. Our architecture is a modular system that provides a highly efficient parallel multiprotocol download engine, using a publisher/subscriber policy that helps the end user obtain data of interest transparently. We have evaluated a first prototype, in collaboration with the ESAC centre in Villafranca del Castillo (Spain), which shows high scalability and performance, opening up a wide spectrum of opportunities.
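The publisher/subscriber idea described above can be sketched minimally: archives publish newly available mission products on topics, and subscribed users have downloads of data of interest triggered transparently. Class, topic and URL names below are illustrative assumptions, not the paper's actual interfaces:

```python
# Hedged sketch of a publish/subscribe broker for mission data products.
# In the real system the handler would enqueue a parallel multiprotocol
# download; here it just records the product URL.
from collections import defaultdict
from typing import Callable

class ArchiveBroker:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        """Register interest in a topic; handler fires on each new product."""
        self._subs[topic].append(handler)

    def publish(self, topic: str, product_url: str) -> None:
        """Notify every subscriber of a newly archived product."""
        for handler in self._subs[topic]:
            handler(product_url)

received: list[str] = []
broker = ArchiveBroker()
broker.subscribe("xmm/epic", received.append)       # hypothetical topic name
broker.publish("xmm/epic", "archive://esac/obs_0123.fits")
print(received)
```

The decoupling shown here is what lets the download engine sit behind the broker and fetch each published product over whichever protocol suits the hosting archive.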
17.
Negative hydraulic barriers that intercept inflowing saltwater by pumping near the coast have been proposed as a corrective measure for seawater intrusion in cases where low heads must be maintained. The main disadvantage of these barriers is that they pump a significant proportion of freshwater, leading to contamination with saltwater at the well. To minimize such mixing, a double pumping barrier system with two extraction wells is proposed: an inland well to pump freshwater and a seawards well to pump saltwater. A three-dimensional variable-density flow model is used to study the dynamics of the system. The system performs very efficiently as a remediation option in the early stages. Long-term performance requires a well-balanced design. If the pumping rate at the seawards well is too high, drawdowns cause saltwater to flow along the aquifer bottom around the seawards well, contaminating the freshwater well. A pumping rate at the seawards well that is too low leads to insufficient desalinization at the freshwater well. A critical pumping rate at the seawards well is defined as that which produces optimal desalinization at the freshwater well. Empirical expressions for the critical pumping rate and salt mass fraction are proposed. Although pumping with partially penetrating wells improves efficiency, the critical pumping rates remain unchanged.
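The design logic above has three regimes around the critical rate. The abstract does not give the empirical expressions, so the critical rate `q_crit` is treated below as a known input (a hypothetical parameter), and only the classification of the seawards-well rate against it is sketched:

```python
# Hedged sketch of the double-barrier design rule: compare the seawards-well
# pumping rate with a (given, hypothetical) critical rate. The tolerance band
# is an illustrative assumption, not from the paper.

def barrier_regime(q_sea: float, q_crit: float, tol: float = 0.05) -> str:
    """Classify a seawards-well pumping rate relative to the critical rate."""
    if q_sea > q_crit * (1 + tol):
        return "overpumping: saltwater rounds the seawards well along the aquifer bottom"
    if q_sea < q_crit * (1 - tol):
        return "underpumping: insufficient desalinization at the freshwater well"
    return "near-critical: optimal desalinization at the freshwater well"
```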
18.
Inverse problem in hydrogeology
Jesús Carrera Andrés Alcolea Agustín Medina Juan Hidalgo Luit J. Slooten 《Hydrogeology Journal》2005,13(1):206-222
The state of the groundwater inverse problem is synthesized. Emphasis is placed on aquifer characterization, where modelers have to deal with conceptual model uncertainty (notably spatial and temporal variability), scale dependence, many types of unknown parameters (transmissivity, recharge, boundary conditions, etc.), nonlinearity, and often low sensitivity of state variables (typically heads and concentrations) to aquifer properties. Because of these difficulties, calibration cannot be separated from the modeling process, as is sometimes done in other fields. Instead, it should be viewed as one step in the process of understanding aquifer behavior. In fact, it is shown that current parameter estimation methods do not differ from each other in essence, though they may differ in computational details. It is argued that there is ample room for improvement in groundwater inversion: development of user-friendly codes, accommodation of variability through geostatistics, incorporation of geological information and different types of data (temperature, occurrence and concentration of isotopes, age, etc.), proper accounting of uncertainty, etc. Despite this, even with existing codes, automatic calibration enormously facilitates the task of modeling. Therefore, it is contended that its use should become standard practice.
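The essence shared by parameter estimation methods is a least-squares fit of model output to observations. A minimal sketch: estimate a transmissivity-like parameter from noisy head observations. The Thiem-style steady-drawdown forward model below is an assumption for illustration, not a model from the paper:

```python
# Hedged sketch of automatic calibration as least squares: recover T from
# noisy drawdown observations. Because this one-parameter model is linear
# in 1/T, the estimate has a closed form; general inverse codes iterate.
import numpy as np

def heads(T: float, r: np.ndarray, Q: float = 100.0, R: float = 500.0) -> np.ndarray:
    """Steady drawdown s = Q / (2*pi*T) * ln(R/r) at radii r (assumed model)."""
    return Q / (2 * np.pi * T) * np.log(R / r)

rng = np.random.default_rng(1)
r = np.array([10.0, 30.0, 60.0, 120.0, 250.0])   # observation radii (m)
T_true = 50.0
obs = heads(T_true, r) + rng.normal(0, 0.01, r.size)  # noisy "measurements"

# s = g / T with g = heads(1, r), so least squares for x = 1/T gives:
g = heads(1.0, r)
T_est = (g @ g) / (g @ obs)
print(round(T_est, 1))  # close to the true T = 50
```

Real inversion adds the complications listed in the abstract (many parameters, geostatistical regularization, multiple data types), but this fit-to-observations core is unchanged.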
19.
Vulnerability assessment in a volcanic risk evaluation in Central Mexico through a multi-criteria-GIS approach
José Fernando Aceves-Quesada Jesús Díaz-Salgado Jorge López-Blanco 《Natural Hazards》2007,40(2):339-356
The Valley of Toluca is a major industrial and agricultural area in Central Mexico, especially the City of Toluca, the capital of the State of Mexico. The Nevado de Toluca volcano is located to the southwest of the Toluca Basin. Results obtained from the vulnerability assessment phase of the study area (5,040 km² and 42 municipalities) are presented here as part of a comprehensive volcanic risk assessment of the Toluca Basin. Information has been gathered and processed at the municipal level, including thematic maps at 1:250,000 scale. A database has been built, classified, and analyzed within a GIS environment; additionally, a Multi-Criteria Evaluation (MCE) approach was applied as an aid to the decision-making process. The cartographic results were five vulnerability maps: (1) Total Population, (2) Land Use/Cover, (3) Infrastructure, (4) Economic Units, and (5) Total Vulnerability. Our main results suggest that the Toluca and Tianguistenco urban and industrial areas, to the north and northeast of the Valley of Toluca, are the most vulnerable, owing to their high concentration of population, infrastructure, and economic activity, and their exposure to volcanic events.
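A common way to realize the MCE step described above is a weighted linear combination of normalized criterion layers into a Total Vulnerability layer. The weights and layer values below are illustrative assumptions, not those used in the study:

```python
# Hedged sketch of a Multi-Criteria Evaluation overlay: Total Vulnerability
# as a weighted sum of normalized (0-1) criterion layers, evaluated here
# for three hypothetical municipalities.
import numpy as np

layers = {
    "population":     np.array([0.9, 0.4, 0.1]),
    "land_use":       np.array([0.7, 0.5, 0.2]),
    "infrastructure": np.array([0.8, 0.3, 0.1]),
    "economic_units": np.array([0.9, 0.2, 0.1]),
}
weights = {"population": 0.4, "land_use": 0.2,
           "infrastructure": 0.2, "economic_units": 0.2}  # must sum to 1

total = sum(w * layers[k] for k, w in weights.items())
print(total.round(2))  # highest score = most vulnerable municipality
```

In a GIS the same arithmetic runs cell-by-cell over raster layers; the municipality ranking here stands in for that map algebra.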
20.