1.
The National Tsunami Hazard Mitigation Program (NTHMP) Steering Committee consists of representatives from the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the states of Alaska, California, Hawaii, Oregon, and Washington. The program addresses three major components: hazard assessment, warning guidance, and mitigation. The first two, hazard assessment and warning guidance, are led by physical scientists who, using research and modeling methods, develop products that allow communities to identify their tsunami hazard areas and receive more accurate and timely warning information. The third, mitigation, is led by emergency managers who use their experience and networks to translate science and technology into user-friendly planning and education products. Mitigation activities focus on assisting the federal, state, and local officials who must plan for and respond to disasters, and the public, which is deeply affected both by disasters themselves and by pre-event planning efforts. The divisions among the three components have softened as NTHMP scientists and emergency managers worked together to develop the best possible products for users, given the best available science, technology, planning methods, and funds.
2.
Prediction of magnitude of the largest potentially induced seismic event   (total citations: 1; self-citations: 0; citations by others: 1)
We propose a method for determining the possible magnitude of the largest potentially induced seismic event, derived from the Gutenberg–Richter law and an estimate of the total released seismic moment. We emphasize that the presented relationship is valid for induced (not triggered) seismicity, as the total seismic moment of triggered seismicity is not bounded by the injection. The ratio of the moment released by the largest event to that released by weaker events is determined by the constants a and b of the Gutenberg–Richter law. We show that, for a given total released seismic moment, it is possible to estimate the number of events greater than a given magnitude. We derive a formula for the moment magnitude of the probable largest seismic event with one occurrence within the recurrence interval (given by one volumetric change caused by mining or injection). Finally, we compare theoretical and measured moment magnitudes of the largest induced seismic events for selected geothermal and hydraulic-fracturing projects.
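As an illustration of the kind of estimate described above, the sketch below scans for the magnitude of a single largest event whose Gutenberg–Richter population releases a prescribed total seismic moment. The truncation scheme, b-value, minimum magnitude, and the Hanks–Kanamori moment–magnitude relation are assumptions of this sketch, not the authors' exact formula.

```python
import math

def moment_from_magnitude(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.1)

def magnitude_from_moment(m0):
    """Moment magnitude from seismic moment M0 in N*m."""
    return (math.log10(m0) - 9.1) / 1.5

def expected_max_magnitude(total_moment, b=1.0, m_min=0.0, dm=0.01):
    """Scan M_max upward until a Gutenberg-Richter population truncated at
    M_max (with exactly one event at M_max, so N(>=M) = 10**(b*(M_max - M)))
    releases at least `total_moment`."""
    m_max = m_min
    while True:
        total = moment_from_magnitude(m_max)  # the single largest event
        m = m_min
        while m < m_max:
            # incremental event count in the bin [m, m + dm)
            n = b * math.log(10) * 10 ** (b * (m_max - m)) * dm
            total += n * moment_from_magnitude(m)
            m += dm
        if total >= total_moment:
            return m_max
        m_max += dm
```

For example, if the total released moment equals that of a single magnitude-3.0 event, the predicted largest event is noticeably smaller (roughly Mw 2.7 for b = 1), because the weaker events carry part of the moment budget.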
3.
Using a set of synthetic P- and S-wave onsets, computed in a 1D medium model from sources that mimic a distribution of microseismic events induced by hydraulic-fracture treatment and recorded by one or more monitoring geophone arrays, we test the feasibility of jointly inverting for the velocity model and the event locations. We use the Neighbourhood algorithm for the data inversion, to account for the non-linear effects of the velocity model, and a grid search for the event locations. The velocity model is composed of homogeneous layers derived from sonic logging. We compare results for the cases of one and two monitoring wells. They show that the velocity model can be recovered with two monitoring wells, provided the wells are optimally positioned relative to each other. Inversion with a single monitoring well fails because of the trade-off between the velocity model and the event locations.  相似文献
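A minimal sketch of the grid-search location step, under the simplifying assumptions of a homogeneous (rather than layered) velocity model and straight rays; the velocity value, receiver geometry, and grid are illustrative, not those of the study:

```python
import itertools
import math

VP = 4000.0  # assumed homogeneous P-wave velocity, m/s

def travel_time(src, rec, vp=VP):
    """Straight-ray P travel time between two (x, y, z) points in metres."""
    return math.dist(src, rec) / vp

def locate_by_grid_search(receivers, picks, grid):
    """Return the grid node minimising the RMS misfit between observed
    arrival times and model travel times, after removing the best-fitting
    origin time (the mean residual)."""
    best, best_rms = None, float("inf")
    for node in grid:
        res = [p - travel_time(node, r) for p, r in zip(picks, receivers)]
        t0 = sum(res) / len(res)            # origin time absorbs the mean
        rms = math.sqrt(sum((x - t0) ** 2 for x in res) / len(res))
        if rms < best_rms:
            best, best_rms = node, rms
    return best, best_rms

# Two vertical monitoring wells, echoing the two-well case in the abstract
receivers = [(0, 0, z) for z in range(1000, 2001, 200)] \
          + [(1000, 800, z) for z in range(1000, 2001, 200)]
true_src = (500, 300, 1500)
picks = [0.2 + travel_time(true_src, r) for r in receivers]  # origin time 0.2 s
grid = list(itertools.product(range(0, 1001, 100),
                              range(0, 801, 100),
                              range(1000, 2001, 100)))
loc, misfit = locate_by_grid_search(receivers, picks, grid)
```

With a single vertical well, the same search would leave the backazimuth undetermined, which is exactly the single-well trade-off the abstract points to.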
4.
Comparison of surface and borehole locations of induced seismicity   (total citations: 1; self-citations: 0; citations by others: 1)
Monitoring of induced microseismic events has become an important tool in hydraulic-fracture diagnostics and in understanding fractured reservoirs in general. We compare microseismic event locations and their uncertainties using data sets obtained with surface and downhole receiver arrays. We first model the uncertainties to understand the effect of the different acquisition geometries on location accuracy. For a vertical array of receivers in a single monitoring borehole, we find that the largest contribution to the final location uncertainty comes from the estimation of the backazimuth, followed by uncertainties in the vertical position and in the radial distance from the receivers. For surface monitoring, the largest uncertainty lies in the vertical position, because only a single phase (usually the P-wave) is used to estimate the event location; lateral positions are estimated robustly and are not sensitive to the velocity model. In this case study, we compare event locations from two catalogues of microseismic events: one from a downhole array and the other from a surface array of 1C geophones. Our results show that origin time can be used reliably to find matching events between the downhole and surface catalogues. The locations of the corresponding events display a systematic shift, consistent with a poorly calibrated velocity model for the downhole dataset. For this case study, locations derived from surface monitoring show less scatter in both the vertical and horizontal directions.
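The origin-time matching and systematic-shift estimate described above can be sketched as follows; the tuple layout, the 0.5 s tolerance, and the greedy pairing strategy are assumptions for illustration:

```python
def match_by_origin_time(downhole, surface, tol=0.5):
    """Pair events from two catalogues, each a list of
    (origin_time_s, x, y, z), when origin times agree within `tol` seconds.
    Greedy: each downhole event takes the closest unused surface event."""
    matches, used = [], set()
    for ev in sorted(downhole):
        cands = [(abs(ev[0] - sv[0]), i) for i, sv in enumerate(surface)
                 if i not in used and abs(ev[0] - sv[0]) <= tol]
        if cands:
            _, i = min(cands)
            used.add(i)
            matches.append((ev, surface[i]))
    return matches

def mean_shift(matches):
    """Average location difference (surface minus downhole) over matched
    pairs -- a crude estimate of the systematic shift between catalogues."""
    n = len(matches)
    return tuple(sum(s[k] - d[k] for d, s in matches) / n for k in (1, 2, 3))
```

A consistent nonzero `mean_shift` across many pairs is the kind of systematic offset that would point to a velocity-model calibration problem in one catalogue.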
6.
The data-based mechanistic (DBM) modelling methodology is applied to the study of reservoir sedimentation. A lumped-parameter, discrete-time model has been developed that directly relates rainfall to suspended sediment load (SSL) at the reservoir outflow, using two years of measured data from Wyresdale Park Reservoir (Lancashire, UK). This nonlinear DBM model comprises two components: a rainfall-to-SSL model and a second model relating the SSL at the reservoir inflow to the SSL at the reservoir spillway. Using a daily measured rainfall series as input, the model is used to reconstruct daily deposition rates between 1911 and 1996. This synthetic sediment-accretion sequence is compared with the variations in sand content within sediment cores collected from the reservoir floor. The profiles show good general agreement, reflecting the importance of low-recurrence, high-magnitude events. This preliminary study highlights the potential of the DBM approach, which could readily be applied to other sites.
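A minimal discrete-time transfer-function sketch of the kind of lumped-parameter model used in DBM studies; the coefficients, the pure time delay, and the omission of any state-dependent (nonlinear) rainfall gain are all simplifying assumptions here, not the fitted Wyresdale Park model:

```python
def simulate_ssl(rain, a=0.8, b=0.3, delay=1):
    """First-order discrete-time model
         ssl[t] = a * ssl[t-1] + b * rain[t - delay]
    mapping a daily rainfall series to suspended sediment load (SSL)."""
    ssl = [0.0] * len(rain)
    for t in range(1, len(rain)):
        u = rain[t - delay] if t >= delay else 0.0
        ssl[t] = a * ssl[t - 1] + b * u
    return ssl
```

A single rainfall impulse produces an SSL pulse that decays geometrically at rate `a`, which is why rare high-magnitude rainfall events dominate a reconstructed deposition sequence of this form.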
7.
New palynological and sedimentological data from St. Lawrence Island present a rare view into late-glacial and Holocene environments of the central Bering Land Bridge. The late glaciation was a time of dynamic landscape changes in south-central Beringia, with active thermokarst processes, including the formation and drainage of thaw lakes. The presence of such a wet, unstable substrate, if widespread, probably would have had an adverse impact on food sources and mobility for many of the large mammal populations. The establishment of Betula shrub tundra on the island suggests late-glacial summers that were warmer than present, consistent with regional paleoclimatic interpretations. However, the increasing proximity to the Bering Sea, as postglacial sea levels rose, moderated the intensity of warming and prevented the establishment of the deciduous forest found in other areas of Beringia at this time. The mid- to late Holocene is marked by more stable land surfaces and by the development of Sphagnum and Cyperaceae peat deposits. The accumulation of organic deposits, the decline of shrub Betula, and the decrease in thermokarst disturbance suggest that conditions were cooler than in the preceding period. A recent decline in peat accumulation at the study sites may relate to local geomorphology, but similar decreases have been noted for other arctic regions.
8.
The continuing use of petrochemicals in mineral nitrogen (N) production may be affected by supply or cost issues and by climate agreements. Without mineral N, a larger area of cropland is required to produce the same amount of food, impacting biodiversity. Alternative N sources include solar and wind power for the Haber-Bosch process, and organic options such as green manures, marine algae, and aquatic azolla. Solar power was the most land-efficient renewable source of N, using a tenth as much land as wind energy and no more than a hundredth as much land as organic sources of N. In this paper we developed a decision tree to locate these different sources of N at a global scale, for the first time taking into account their spatial footprint and their impact on terrestrial biodiversity while avoiding impacts on albedo and cropland, based on global resource and impact datasets. This yielded relatively few areas suitable for solar power, in the western Americas, central-southern Africa, eastern Asia, and southern Australia, with the areas most suited to wind at more extreme latitudes. Only about 2% of existing solar power stations are in very suitable locations. In regions such as coastal North Africa and central Asia, where solar power is less accessible owing to low farm income, green manures could be used; however, because of their very large spatial footprint, only a small area of low productivity and low biodiversity was suitable for this option. Europe in particular faces challenges because it has access to a relatively small area suitable for solar or wind power. If we are to make informed decisions about the sourcing of alternative N supplies in the future, and about our energy supply more generally, a decision-making mechanism is needed to take global considerations into account in regional land-use planning.
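A toy version of such a decision tree for a single location; the branch order and the boolean criteria are guesses at the logic described above (solar preferred for land efficiency, then wind, then green manure where impacts are small), not the actual global analysis:

```python
def choose_n_source(solar_suitable, wind_suitable, farm_income_ok,
                    low_productivity, low_biodiversity):
    """Pick an alternative nitrogen source for one location from a few
    boolean suitability flags (all hypothetical criteria)."""
    if solar_suitable and farm_income_ok:
        return "solar-powered Haber-Bosch"   # most land-efficient option
    if wind_suitable and farm_income_ok:
        return "wind-powered Haber-Bosch"    # ~10x the land of solar
    if low_productivity and low_biodiversity:
        return "green manure"                # very large spatial footprint
    return "no suitable alternative"
```

In the actual study each flag would be derived from gridded global resource and impact datasets rather than supplied by hand.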
9.
To enhance global water use assessment, the WaterGAP3 model was improved to back-calculate domestic, manufacturing, and thermoelectric water uses back to 1950 for 177 countries. Model simulations were carried out on a national scale to estimate water withdrawals and consumption, as well as the cooling water required for industrial processes and electricity production. Additionally, the amount of treated and untreated wastewater generated by the domestic and manufacturing sectors was modeled. In view of data availability, the simulations are based on key socio-economic driving forces and thermal electricity production. Technological change rates were derived from statistical records in order to account for developments in water use efficiency, which turned out to play a crucial role in water use dynamics. Simulated domestic and industrial water uses increased from ca. 300 km3 in 1950 to 1345 km3 in 2010, 12% of which was consumed and 88% of which was discharged back into freshwater bodies. The amount of domestic and manufacturing wastewater increased considerably over the last decade, but only half of it was untreated. Downscaling the untreated wastewater volume to the river basin scale indicates areas of concern in East and Southeast Asia, Northern Africa, and Eastern and Southern Europe. In order to reach the Millennium Development Goals, securing water supply and reducing untreated wastewater discharges should be among the priority actions. Population growth and increased prosperity have led to increasing water demands. However, societal and political transformation processes, as well as policy regulations resulting in new water-saving technologies and improvements, counteract this development by slowing down and even reducing global domestic and industrial water uses.
10.
We have developed a method to calculate site and path effects in complex heterogeneous media using synthetic Green's functions. The Green's functions are calculated numerically by imposing body forces at the site of interest and then storing the reciprocal Green's functions along arbitrary finite-fault surfaces. By using reciprocal Green's functions, we can simulate many source scenarios for those faults, because the primary numerical calculations need be done only once. The advantage of the proposed method is demonstrated by evaluating the site and path effects for three sites in the vicinity of the Los Angeles basin using the Southern California Velocity Model (version 2.2, Magistrale et al., 2000). In this example, we simulated 300 source scenarios for 5 major southern California faults and compared their responses at the selected sites for periods longer than 3 s. However, a more detailed comparison with strong-motion records will be necessary before a particular hazard assessment can be made. For the tested source scenarios, the results show that the variations in peak velocity amplitudes and durations due to the source scenario are as large as the variations due to the heterogeneous velocity model.
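The economy of the reciprocal approach (compute Green's functions once per site, then reuse them across many scenarios) can be sketched as a linear superposition; the array shapes and the purely linear slip weighting are illustrative assumptions, not the full numerical scheme:

```python
import numpy as np

def site_response(greens, slips):
    """Seismogram at one site for one source scenario.
    greens: (n_nodes, n_samples) array -- precomputed response at the site
            to a unit source at each fault node (computed once per site).
    slips:  (n_nodes,) array -- slip of this scenario at each node.
    Returns the slip-weighted superposition, shape (n_samples,)."""
    return slips @ greens

# Reuse: one Green's-function table, many scenarios (cf. the 300 in the text)
rng = np.random.default_rng(0)
greens = rng.standard_normal((4, 256))      # 4 fault nodes, 256 time samples
scenarios = rng.standard_normal((300, 4))   # 300 slip distributions
seismograms = scenarios @ greens            # all scenario responses at once
```

The expensive wave-propagation step fills `greens` once; every additional scenario afterwards costs only a matrix multiply.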