81.
Since the 1970s, multiple reconstruction techniques have been proposed, and are currently used, to extrapolate and quantify eruptive parameters from sampled tephra fall deposit datasets. Atmospheric transport and deposition processes strongly control the spatial distribution of tephra deposits; a large uncertainty therefore affects mass-derived estimates, especially for fall layers that are not well exposed. This paper has two main aims: the first is to analyse the sensitivity of reconstruction techniques to the deposit sampling strategy; the second is to assess whether the modelled values of emitted mass and grain size differ from the values estimated from the deposits. We find significant differences and propose a new correction strategy. A numerical approach is demonstrated by simulating, with a dispersal code, a mild explosive event that occurred at Mt. Etna on 24 November 2006. Eruptive parameters are reconstructed by inverting information collected after the eruption. A full synthetic deposit is created by integrating the deposited mass computed by the model over the computational domain (an area of 7.5 × 10⁴ km²). A statistical analysis based on 2000 sampling tests of 50 sampling points shows a large variability, up to 50%, for all the reconstruction techniques. Moreover, for some test examples the Power Law errors are larger than the estimated uncertainty. A similar analysis of simulated grain-size classes shows how spatial sampling limitations strongly reduce the utility of the available information on the total grain-size distribution. For example, information on particles coarser than −4φ is completely lost when sampling at 1.5 km from the vent, for all columns with heights less than 2000 m above the vent. To correct for this effect, an optimal sampling strategy and a new reconstruction method are presented.
A sensitivity study shows that our method can be extended to a wide range of eruptive scenarios, including those in which aggregation processes are important. The new correction method estimates the deficiency in calculated deposited mass for each simulated class, providing a reliable estimate of the uncertainties in the reconstructed total (whole-deposit) grain-size distribution.
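The sampling-sensitivity experiment described above can be illustrated with a toy model. The sketch below is a minimal Python illustration, not the authors' dispersal code: it assumes a radially symmetric synthetic deposit whose thickness thins exponentially with distance from the vent, then repeatedly inverts random 50-point samplings for total erupted mass. The thinning parameters and noise level are arbitrary assumptions; the spread of the estimates mimics the kind of variability reported.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical radially symmetric deposit: thickness thins exponentially.
T0, k = 1.0, 0.5                      # proximal thickness (m), thinning rate (1/km)
true_mass = 2 * np.pi * T0 / k**2     # analytic integral of T0*exp(-k*r) over the plane

def estimate_mass(n_samples=50):
    # Sample thickness at random distances from the vent, with
    # multiplicative lognormal "measurement" scatter.
    r = rng.uniform(0.5, 20.0, n_samples)
    t = T0 * np.exp(-k * r) * rng.lognormal(0.0, 0.3, n_samples)
    # A log-linear fit recovers the exponential thinning parameters,
    # which are then integrated back to a total-mass estimate.
    slope, intercept = np.polyfit(r, np.log(t), 1)
    k_est, T0_est = -slope, np.exp(intercept)
    return 2 * np.pi * T0_est / k_est**2

estimates = [estimate_mass() for _ in range(2000)]
```

The scatter of `estimates` around `true_mass` is the quantity the sensitivity analysis characterises; with sparser or more proximal-only sampling the spread widens considerably.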
82.
We compare the performance of four stochastic optimisation methods on four analytic objective functions and two highly non-linear geophysical optimisation problems: one-dimensional elastic full-waveform inversion and residual statics computation. The four methods we consider, namely adaptive simulated annealing, the genetic algorithm, the neighbourhood algorithm, and particle swarm optimisation, are frequently employed for solving geophysical inverse problems. Because geophysical optimisations typically involve many unknown model parameters, we are particularly interested in comparing the performance of these stochastic methods as the number of unknowns increases. The four analytic functions we choose simulate common types of objective functions encountered in geophysical optimisation: a convex function, two multi-minima functions that differ in the distribution of minima, and a nearly flat function. As with the analytic tests, the two seismic optimisation problems we analyse are characterised by very different objective functions. The first problem is a one-dimensional elastic full-waveform inversion, which is strongly ill-conditioned and exhibits a nearly flat objective function with a valley of minima extended along the density direction. The second problem is the residual statics computation, which is characterised by a multi-minima objective function produced by the so-called cycle-skipping phenomenon. According to the tests on the analytic functions and on the seismic data, the genetic algorithm generally displays the best scaling with the number of parameters. It encounters problems only in the case of an irregular distribution of minima, that is, when the global minimum is at the border of the search space and a number of important local minima are distant from it.
The adaptive simulated annealing method is often the best-performing method for low-dimensional model spaces, but its performance worsens as the number of unknowns increases. Particle swarm optimisation is effective in finding the global minimum in low-dimensional model spaces with few local minima, or in the case of a narrow flat valley. Finally, the neighbourhood algorithm is competitive with the other methods only for low-dimensional model spaces; its performance worsens markedly for multi-minima objective functions.
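One of the compared methods is compact enough to sketch in full. Below is a generic particle swarm optimiser applied to a convex (sphere-type) test function, analogous to the first class of analytic functions used in the comparison. The inertia and acceleration constants are common textbook defaults, not the settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Convex test function with its global minimum at the origin.
    return np.sum(x**2, axis=-1)

def pso(f, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimiser: returns (best position, best value)."""
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_f = x.copy(), f(x)                   # personal bests
    gbest = pbest[np.argmin(pbest_f)].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive pull (pbest) + social pull (gbest)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = f(x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, f(gbest)

best_x, best_f = pso(sphere)
```

On a convex function like this all four methods converge; the interesting differences reported above appear as the dimension grows and as local minima multiply.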
83.
A realistic definition of seismic input for the Catania area is obtained using advanced modeling techniques that allow the computation of synthetic seismograms containing both body and surface waves. With the modal summation technique, extended to laterally heterogeneous anelastic structural models, we create a database of synthetic signals that can be used to study the local response at a set of selected sites within the Catania area. We propose a ground-shaking scenario corresponding to the source spectrum of an earthquake that mimics the destructive event of 11 January 1693. Using the simplified geotechnical map of the Catania area, we produce maps that illustrate the spatial variability of the SH waveforms over the entire area. Using detailed geological and geotechnical information along a selected cross-section, we study the site response to SH and P-SV motion in a very realistic case, adopting and comparing different estimation techniques.
84.
Continuous GPS (CGPS) data collected at Mt. Etna between April 2012 and October 2013 clearly define the inflation/deflation processes typically observed before/after an eruption onset. During the inflation from May to October 2013, a deformation pattern localised in the upper north-eastern sector of the volcano suggests that a magma intrusion occurred a few km away from the axis of the summit craters, beneath the NE Rift system. This is the first time that such a pattern has been recorded by CGPS data at Mt. Etna. We believe that this inflation process may have taken place periodically at Mt. Etna and may be associated with the intrusion of batches of magma that are separate from the main feeding system. We provide a model to explain this unusual behaviour and the eruptive regime of this rift zone, which is characterised by long periods of quiescence followed by often dangerous eruptions in which vents can open at low elevation and thus threaten the villages in this sector of the volcano.
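A point-pressure ("Mogi") source is the standard first-order model for interpreting inflation patterns of this kind from GPS displacements. The sketch below computes the surface displacement of such a source in an elastic half-space; the depth and volume-change values are hypothetical illustrations, not the parameters of the model proposed in the paper.

```python
import numpy as np

def mogi_displacement(x, y, depth, dV, nu=0.25):
    """Surface displacement (m) of a Mogi point source in an elastic half-space.

    x, y  : horizontal coordinates relative to the source (m)
    depth : source depth (m); dV: volume change (m^3); nu: Poisson's ratio
    """
    R3 = (x**2 + y**2 + depth**2) ** 1.5
    c = (1.0 - nu) * dV / np.pi
    return c * x / R3, c * y / R3, c * depth / R3

# Hypothetical inflating source: 5 km deep, 10^6 m^3 volume increase,
# evaluated along a 20-km east-west profile through the source axis.
x = np.linspace(-10e3, 10e3, 201)
ux, uy, uz = mogi_displacement(x, 0.0, 5000.0, 1e6)
```

The uplift peaks directly above the source and the horizontal motion points radially away from it, which is why an off-axis CGPS deformation pattern points to an intrusion displaced from the summit craters.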
85.
This work tests a modeling strategy recently developed to simulate the gross and net carbon fluxes of Mediterranean forest ecosystems. The strategy combines the outputs of an NDVI-driven parametric model, C-Fix, and a biogeochemical model, BIOME-BGC, to simulate the behavior of forest ecosystems at different development stages. The performance of the modeling strategy is evaluated at three Italian study sites (San Rossore, Lecceto and Pianosa), where carbon fluxes are measured with the eddy correlation technique. These sites are characterized by variable Mediterranean climates and are covered by different types of forest vegetation (pine wood, holm oak forest and macchia, respectively). The tests indicate that the modeling strategy is generally capable of reproducing the monthly GPP and NEE patterns at all three study sites. The highest accuracy is obtained in the mature, homogeneous pine wood of San Rossore, while the worst results are found in the Lecceto forest, which has the most heterogeneous terrain, soil and vegetation conditions. The main error sources are traced to the inaccurate definition of model inputs, particularly those regulating the site water budgets, which exert strong control on forest productivity during the Mediterranean summer dry season. In general, the incorporation of NDVI-derived fAPAR estimates corrects most of these errors and makes the forest flux simulations more stable and accurate.
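NDVI-driven parametric GPP models of this family typically follow a Monteith light-use-efficiency form. The sketch below is a generic illustration of that form, not the actual C-Fix formulation; the linear NDVI-to-fAPAR mapping and every coefficient value are assumptions for demonstration only.

```python
def fapar_from_ndvi(ndvi, a=1.24, b=-0.168):
    # Assumed linear NDVI -> fAPAR mapping, clipped to the physical range [0, 1].
    return min(max(a * ndvi + b, 0.0), 1.0)

def gpp(par, ndvi, eps_max=1.1, f_temp=1.0, f_water=1.0):
    """Monteith-type gross primary production (g C / m^2 / day).

    par              : incident photosynthetically active radiation (MJ/m^2/day)
    eps_max          : maximum light-use efficiency (g C / MJ), assumed value
    f_temp, f_water  : dimensionless stress scalars in [0, 1]
    """
    return eps_max * f_temp * f_water * fapar_from_ndvi(ndvi) * par
```

In this form, a summer drought enters through `f_water` < 1, which is exactly the water-budget control the study identifies as the dominant error source; satellite fAPAR corrects the model because it observes the canopy's actual light absorption rather than relying on a simulated one.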
86.
Permanent downhole sensors act as the eyes and ears of the reservoir, enabling monitoring of reservoir conditions in real time. In particular, the use of sensors and remotely controlled valves in wells and on the surface, in combination with reservoir flow models, provides enormous benefits to reservoir management and oil production. We suggest borehole radar measurements as a promising technique for monitoring the arrival of undesired fluids in the proximity of production wells. We use 1D modelling to investigate the expected signal magnitude and depth of investigation of a borehole radar sensor operating in an oilfield environment. We restrict the applicability of radar to environments where its depth of investigation can cover the portion of the reservoir that needs to be monitored. Potential applications are steam-chamber monitoring in steam-assisted gravity drainage processes and water-front monitoring in thin oil-rim environments. A more sophisticated analysis of the limits of a radar system is carried out through 2D finite-difference time-domain simulations. The metal components of the wellbore casing can cause destructive interference with the emitted signal. A high-dielectric medium surrounding the production well increases the amplitude of the signal and thus the radar performance. Other constraints are given by the complexity of the reservoir and the dynamics of the fluids. Time-lapse changes in the heterogeneity of the background formation strongly affect the retrieval of the target reflections, and gradual fluid-saturation changes reduce the amplitudes of the reflections.
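The feasibility constraint above, that the radar's depth of investigation must cover the reservoir volume to be monitored, can be sketched with a low-loss attenuation estimate. The formulation and all numerical values below (conductivities, relative permittivities, system dynamic range) are generic textbook assumptions, not figures from the study.

```python
import math

EPS0 = 8.854e-12            # vacuum permittivity (F/m)
MU0 = 4e-7 * math.pi        # vacuum permeability (H/m)

def attenuation_db_per_m(sigma, eps_r):
    # Low-loss approximation: alpha ~ (sigma/2) * sqrt(mu0 / (eps_r * eps0)) [Np/m]
    alpha = (sigma / 2.0) * math.sqrt(MU0 / (eps_r * EPS0))
    return alpha * 8.686    # nepers -> decibels

def max_range(sigma, eps_r, dynamic_range_db=100.0):
    # Two-way travel: usable one-way range before the total path loss
    # exhausts the assumed system dynamic range.
    return dynamic_range_db / (2.0 * attenuation_db_per_m(sigma, eps_r))
```

With assumed values, a resistive oil-bearing formation (sigma ~ 1e-3 S/m, eps_r ~ 8) allows ranges of tens of metres, whereas a brine-saturated zone (sigma ~ 1 S/m) limits the radar to well under a metre, which is why the technique is restricted to favourable environments such as steam chambers and thin oil rims.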
87.
Summary  The analytical expressions for the partial derivatives of the phase and group velocity of Rayleigh waves with respect to the P-wave velocity, the S-wave velocity, and the density are derived, and the related computer code is developed. The results of the analytical computations were satisfactorily tested against numerically determined values. Several examples of partial derivatives for a given structural model are presented.
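The kind of numerical verification mentioned, checking analytical derivatives against numerically determined values, can be illustrated for the simplest case of a homogeneous half-space. The secular function below is the classical Rayleigh equation; the bisection solver and central-difference derivative check are a generic sketch, not the paper's code.

```python
import math

def rayleigh_residual(c, vp, vs):
    # Classical Rayleigh secular function for a homogeneous half-space (c < vs < vp).
    return (2.0 - (c / vs)**2)**2 \
        - 4.0 * math.sqrt(1.0 - (c / vp)**2) * math.sqrt(1.0 - (c / vs)**2)

def rayleigh_speed(vp, vs):
    # Bisection for the non-trivial root, bracketed in (0.5*vs, 0.999*vs).
    lo, hi = 0.5 * vs, 0.999 * vs
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if rayleigh_residual(lo, vp, vs) * rayleigh_residual(mid, vp, vs) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def dc_dvs(vp, vs, h=1e-5):
    # Central-difference partial derivative of phase velocity w.r.t. S-wave velocity,
    # the numerical counterpart against which analytic derivatives can be tested.
    return (rayleigh_speed(vp, vs + h) - rayleigh_speed(vp, vs - h)) / (2.0 * h)
```

For a Poisson solid (vp/vs = sqrt(3)) the solver recovers the well-known ratio c/vs ≈ 0.9194; for layered models the same finite-difference check applies mode by mode.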
88.
Interest in high-resolution satellite imagery (HRSI) is spreading across several application fields, at both scientific and commercial levels. Fundamental and critical goals for the geometric use of this kind of imagery are orientation and orthorectification, processes that georeference the imagery and correct the geometric deformations it undergoes during acquisition. In order to exploit the actual potential of orthorectified imagery in geomatics applications, defining a methodology to assess the spatial accuracy achievable from oriented imagery is a crucial topic.

In this paper we propose a new method for accuracy assessment based on Leave-One-Out Cross-Validation (LOOCV), a model validation method already applied in fields such as machine learning, bioinformatics and, generally, any other field requiring an evaluation of the performance of a learning algorithm (e.g. geostatistics), but never before applied to HRSI orientation accuracy assessment.

The proposed method exhibits interesting features that overcome the most remarkable drawbacks of the commonly used method (Hold-Out Validation, HOV), which is based on partitioning the known ground points into two sets: the first is used in the orientation-orthorectification model (GCPs, Ground Control Points) and the second is used to validate the model itself (CPs, Check Points). In fact, the HOV is generally not reliable, and it is not applicable when only a low number of ground points is available.

To test the proposed method we implemented a new routine that performs the LOOCV in the software SISAR, developed by the Geodesy and Geomatics Team at the Sapienza University of Rome to perform the rigorous orientation of HRSI; this routine was tested on some EROS-A and QuickBird images. Moreover, these images were also oriented using the widely recognized commercial software OrthoEngine v.10 (included in the Geomatica suite by PCI), performing the LOOCV manually since only the HOV is implemented. The software comparison confirmed the overall correctness and good performance of the SISAR model, whereas the results showed the good features of the LOOCV method.
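The LOOCV procedure itself is simple to sketch. In the toy example below a plain affine transformation stands in for the rigorous sensor-orientation model, and the ground points are synthetic; each known point serves in turn as the single check point while all the others act as ground control points, so no point is wasted on a separate validation set.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ground points: image coordinates and their ground coordinates,
# related by an assumed affine transform plus measurement noise.
n = 12
img = rng.uniform(0.0, 1000.0, (n, 2))
A_true = np.array([[0.5, 0.02], [-0.01, 0.5]])
t_true = np.array([300.0, 700.0])
ground = img @ A_true.T + t_true + rng.normal(0.0, 0.1, (n, 2))

def fit_affine(src, dst):
    # Least-squares affine model: dst ~ src @ A.T + t, packed as one (3, 2) matrix.
    X = np.hstack([src, np.ones((len(src), 1))])
    coef, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return coef

def predict(coef, src):
    return np.hstack([src, np.ones((len(src), 1))]) @ coef

# Leave-one-out: point i is the check point, the rest are control points.
residuals = []
for i in range(n):
    mask = np.arange(n) != i
    coef = fit_affine(img[mask], ground[mask])
    residuals.append(predict(coef, img[i:i + 1])[0] - ground[i])

res = np.asarray(residuals)
rmse = float(np.sqrt(np.mean(np.square(res))))
```

The resulting `rmse` plays the role of the accuracy estimate that HOV would obtain from a dedicated set of check points, but it remains computable even when only a handful of ground points is available.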
89.
Data discoverability, accessibility, and integration are frequent barriers for scientists and a major obstacle to successful environmental research. To tackle this issue, the Group on Earth Observations (GEO) is leading the development of the Global Earth Observation System of Systems (GEOSS), a voluntary effort that connects Earth Observation resources world-wide, acting as a gateway between producers and users of environmental data. GEO recognizes the importance of capacity building and education for reaching broad adoption of, acceptance of, and commitment to data-sharing principles, and for increasing the capacity to access and use Earth Observation data. This article presents “Bringing GEOSS services into practice” (BGSIP), an integrated set of teaching material and software that facilitates the publication and use of environmental data through standardized discovery, view, download, and processing services, further facilitating the registration of data into GEOSS. So far, 520 participants in 10 countries have been trained using this material, leading to numerous Spatial Data Infrastructure implementations and 1,000 tutorial downloads. This workshop lowers the entry barriers for both data providers and users, facilitates the development of technical skills, and empowers people.
90.
An innovative approach to seismic hazard assessment is illustrated that, based on the available knowledge of the physical properties of the Earth's structure and of seismic sources, on geodetic observations, and on geophysical forward modeling, allows for a time-dependent definition of the seismic input. Following the proposed approach, a fully formalized system integrating Earth Observation data and new advanced methods of seismological and geophysical data analysis is currently under development in the framework of the Pilot Project SISMA, funded by the Italian Space Agency. The synergistic use of geodetic Earth Observation (EO) data and geophysical forward-modeling deformation maps at the national scale complements the space- and time-dependent information provided by real-time monitoring of seismic flow (performed by means of the earthquake prediction algorithms CN and M8S) and permits the identification and routine updating of alerted areas. At the local spatial scale (tens of km) of the seismogenic nodes identified by pattern-recognition analysis, both GNSS (Global Navigation Satellite System) and SAR (Synthetic Aperture Radar) techniques, coupled with models developed expressly for the interseismic phase, allow us to retrieve the deformation style and stress evolution within the seismogenic areas. The displacement fields obtained from EO data provide the input for the geophysical modeling, which ultimately indicates whether a specific fault is in a “critical state.” The scenarios of expected ground motion (shakemaps) associated with the alerted areas are then defined by means of full-waveform modeling, based on the computation of synthetic seismograms by the modal summation technique (neo-deterministic hazard assessment). In this way, a set of deterministic ground-motion scenarios, referring to the time interval when a strong event is likely to occur within the alerted area, can be defined at both the national and the local scale.

The considered integrated approach opens new routes to understanding the dynamics of fault zones and to modeling the expected ground motion. The SISMA system, in fact, provides tools for establishing warning criteria based on deterministic and rigorous forward geophysical models, and hence allows for well-controlled, real-time prospective testing and validation of the proposed methodology over the Italian territory. The proposed approach complements the traditional probabilistic approach to seismic hazard estimation, since it supplies routinely updated information useful for assigning priorities for timely mitigation actions, and hence is particularly relevant to Civil Defense purposes.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号