Paid full text: 12 articles
Free: 2 articles
Free in China: 1 article
Geophysics: 8 articles
Geology: 6 articles
Astronomy: 1 article
2019: 3 articles
2017: 1 article
2016: 1 article
2012: 2 articles
2011: 1 article
2009: 2 articles
2006: 2 articles
2004: 1 article
2002: 1 article
1996: 1 article
15 query results in total (search time: 31 ms)
1.
We introduce the concept of generalized blending and deblending, develop its models and, based on these models, establish a method for deblended-data reconstruction. The generalized models can handle realistic situations by including random encoding in the generalized operators, both in the space and time domains and at both the source and receiver sides. We consider an iterative optimization scheme using a closed-loop approach with the generalized blending and deblending models, in which the former performs the forward modelling and the latter the inverse modelling in the closed loop. We applied our method to existing real data acquired in Abu Dhabi. The results show that our method fully reconstructs deblended data even from fully generalized, and thus quite complicated, blended data. We discuss the effect of the complexity of the blending properties on deblending performance. In addition, we discuss the applicability to time-lapse seismic monitoring, as the method ensures high repeatability of the surveys. In conclusion, blended data can be acquired and deblended data reconstructed without serious problems, while retaining the benefits of blended acquisition.
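As a rough illustration of the closed-loop idea described above (my own sketch, not the authors' implementation), the snippet below alternates between a forward blending operator and a simple inverse step with a shrinking threshold. The blending scheme (time-delayed summation of shots), the threshold schedule and the array shapes are illustrative assumptions.

```python
import numpy as np

def blend(shots, delays, blended_len):
    """Forward blending: sum time-shifted single-shot records (nshots, nt, nr) into one blended record."""
    nr = shots.shape[2]
    blended = np.zeros((blended_len, nr))
    for shot, delay in zip(shots, delays):
        blended[delay:delay + shot.shape[0], :] += shot  # assumes blended_len >= delay + nt
    return blended

def pseudo_deblend(blended, delays, nt, nshots):
    """Adjoint-like step: copy the blended record back into each shot's own time window."""
    shots = np.zeros((nshots, nt, blended.shape[1]))
    for i, delay in enumerate(delays):
        shots[i] = blended[delay:delay + nt, :]
    return shots

def closed_loop_deblend(blended, delays, nt, n_iter=50):
    """Iterative closed-loop deblending with a shrinking hard threshold (illustrative scheme)."""
    nshots = len(delays)
    estimate = np.zeros((nshots, nt, blended.shape[1]))
    for k in range(n_iter):
        residual = blended - blend(estimate, delays, blended.shape[0])    # misfit in the blended domain
        update = estimate + pseudo_deblend(residual, delays, nt, nshots)  # map the residual back per shot
        threshold = np.percentile(np.abs(update), 99) * (1.0 - k / n_iter)  # assumed threshold schedule
        estimate = np.where(np.abs(update) > threshold, update, 0.0)      # keep only strong, coherent energy
    return estimate
```

The loop relies on blending noise appearing incoherent in the shot domain, so that thresholding suppresses it while the consistency check against the recorded blended data keeps the estimate honest.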
2.
Changes in the terms and direction of international trade in seafood, an increased understanding of and concern for the public health risk posed by seafood products, and advances in information management technology combine to open opportunities to manage seafood-borne risk more effectively. Present regulatory mandates and programs lack sufficient integration for effective risk mitigation and do not adequately reflect the trans-national nature of seafood trade or the increased complexity of seafood production. This paper argues that the concept of a "chain of custody" - from the ocean to the final consumer - provides a useful integrating framework for understanding and refining efforts to reduce public health concerns surrounding the consumption of seafood.
3.
Blended acquisition along with efficient spatial sampling is capable of providing high-quality seismic data in a cost-effective and productive manner. While deblending and data reconstruction conventionally accompany this way of acquiring data, the recorded data can also be processed directly to estimate subsurface properties. We establish a workflow to design survey parameters that account for the source blending as well as the spatial sampling of sources and detectors. The proposed method involves an iterative scheme to derive the survey design leading to optimum reflectivity and velocity estimation via joint migration inversion. In the workflow, we extend the standard implementation of joint migration inversion to cope with data acquired in a blended fashion along with irregular detector and source geometries. This makes a direct estimation of reflectivity and velocity models feasible without the need for deblending or data reconstruction. During the iterations, the errors in the reflectivity and velocity estimates are used to update the survey parameters by integrating a genetic algorithm and a convolutional neural network. Bio-inspired operators enable the simultaneous update of the blending and sampling operators. To relate the choice of survey parameters to the performance of joint migration inversion, we utilize a convolutional neural network. The applied network architecture discards suboptimal solutions among newly generated ones and carries optimal ones to the subsequent step, which improves the efficiency of the proposed approach. The resultant acquisition scenario yields a notable enhancement in both reflectivity and velocity estimation attributable to the choice of survey parameters.
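The sketch below outlines one possible shape of such an iterative survey-design loop. It is a toy stand-in, not the paper's code: the parameter set (source/detector intervals, blending fold, time dither), the surrogate fitness in place of the joint migration inversion error, and the screening step in place of the trained convolutional neural network are all assumptions for illustration.

```python
import random

# Candidate survey design: (source_interval_m, detector_interval_m, blending_fold, max_time_dither_ms)
PARAM_RANGES = [(12.5, 100.0), (12.5, 100.0), (2, 8), (0, 500)]

def random_candidate():
    s, d, b, t = PARAM_RANGES
    return [random.uniform(*s), random.uniform(*d), random.randint(*b), random.uniform(*t)]

def jmi_error(candidate):
    """Stand-in for the joint migration inversion misfit of a design; a toy surrogate
    that merely favours dense sampling, higher blending fold and larger dithers."""
    src, det, fold, dither = candidate
    return src * det / (fold * (1.0 + dither / 500.0))

def screen(candidates, keep=10):
    """Stand-in for the CNN that discards suboptimal candidates before full evaluation."""
    return sorted(candidates, key=jmi_error)[:keep]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(c, rate=0.2):
    return [random_candidate()[i] if random.random() < rate else v for i, v in enumerate(c)]

def optimize(generations=30, pop_size=40):
    population = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        parents = screen(population, keep=10)   # prune suboptimal designs
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children         # carry the best designs forward
    return min(population, key=jmi_error)

print(optimize())
```

In the actual workflow the expensive inversion would only be run on candidates that survive the screening step, which is where the efficiency gain comes from.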
4.
Three-dimensional seismic survey design should provide an acquisition geometry that enables imaging and amplitude-versus-offset applications of target reflectors with sufficient data quality under given economic and operational constraints. However, in land or shallow-water environments, surface waves are often dominant in the seismic data. The effectiveness of surface-wave separation or attenuation significantly affects the quality of the final result. Therefore, the need for surface-wave attenuation imposes additional constraints on the acquisition geometry. Recently, we have proposed a method for surface-wave attenuation that can better deal with aliased seismic data than classic methods such as slowness/velocity-based filtering. Here, we investigate how surface-wave attenuation affects the selection of survey parameters and the resulting data quality. To quantify the latter, we introduce a measure that represents the estimated signal-to-noise ratio between the desired subsurface signal and the surface waves that are deemed to be noise. In a case study, we applied surface-wave attenuation and signal-to-noise ratio estimation to several data sets with different survey parameters. The spatial sampling intervals of the basic subset are the survey parameters that affect the performance of surface-wave attenuation methods the most. Finer spatial sampling reduces aliasing and makes surface-wave attenuation easier, resulting in better data quality until no further improvement is obtained. We observed this behaviour as a main trend that levels off at increasingly denser sampling. With our method, this trend curve lies at a considerably higher signal-to-noise ratio than with a classic filtering method. This means that we can obtain much better data quality for a given survey effort, or the same data quality as with a conventional method at a lower cost.
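As a loose illustration of the signal-to-noise measure discussed above (not the authors' exact definition), the snippet below assumes the surface waves have already been estimated by some separation method and computes an energy ratio between the remaining reflection signal and that surface-wave estimate.

```python
import numpy as np

def surface_wave_snr_db(recorded, surface_wave_estimate):
    """Energy ratio (in dB) between the presumed reflection signal and the surface waves.
    Both inputs are (ntraces, nsamples) arrays; the separation that produced the
    surface-wave estimate is assumed to have been applied beforehand."""
    signal = recorded - surface_wave_estimate
    signal_energy = np.sum(signal ** 2)
    noise_energy = np.sum(surface_wave_estimate ** 2) + 1e-12  # guard against division by zero
    return 10.0 * np.log10(signal_energy / noise_energy)
```

Computed per survey scenario, such a number can be plotted against the spatial sampling interval to reproduce the levelling-off trend described in the abstract.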
5.
This article focuses on the dynamics of using numbers to construct an image of social reality in disaster areas. Numbers are neither objective nor value-neutral but are generated, transmitted and shared with social signification; in other words, numbers can be thought of as socially constructed information. Statistics and other numbers usually work in positive ways. However, the use of numbers in the media can also convey unintended messages that produce negative consequences. We conducted field studies in disaster-stricken areas of the 2008 Wenchuan earthquake in China and compared the findings with the case of the 1995 Kobe earthquake in Japan, in order to examine how numbers (the amount of donations, the timeline of reconstruction projects and casualty figures) construct social reality and cause a variety of social dysfunctions.
6.
High-pressure metamorphic rocks exposed in the Bantimala area, c. 40 km north-east of Ujung Pandang, were formed as a Cretaceous subduction complex with fault-bounded slices of melange, chert, basalt, turbidite, shallow-marine sedimentary rocks and ultrabasic rocks. Eclogites, garnet–glaucophane rocks and schists of the Bantimala complex have estimated peak temperatures of T = 580–630 °C at 18 kbar and T = 590–640 °C at 24 kbar, using the garnet–clinopyroxene geothermometer. The garnet–omphacite–phengite equilibrium is used to estimate pressures. The distribution coefficient K_D1 = [(X_pyr)^3 (X_grs)^6 / (X_di)^6] / [(Al/Mg)_(M2,wm) (Al/Si)_(T2,wm)]^3 among omphacite, garnet and phengite is a good index for metamorphic pressures. The K_D1 values of the Bantimala eclogites were compared with those of eclogites with reliable P–T estimates. This comparison suggests that peak pressures of the Bantimala eclogites were P = 18–24 kbar at T = 580–640 °C. These results are consistent with the P–T range calculated using garnet–rutile–epidote–quartz and lawsonite–omphacite–glaucophane–epidote equilibria.
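For concreteness, the snippet below evaluates the distribution coefficient exactly as written in the abstract above. The input values are made-up placeholders, not measured Bantimala compositions.

```python
def k_d1(x_pyr, x_grs, x_di, al_mg_m2, al_si_t2):
    """Distribution coefficient K_D1 among garnet, omphacite and phengite, following the
    expression quoted above: [(X_pyr)^3 (X_grs)^6 / (X_di)^6] / [(Al/Mg)_M2,wm * (Al/Si)_T2,wm]^3."""
    return (x_pyr**3 * x_grs**6 / x_di**6) / (al_mg_m2 * al_si_t2)**3

# Illustrative (hypothetical) mole fractions and phengite site ratios:
print(k_d1(x_pyr=0.25, x_grs=0.30, x_di=0.45, al_mg_m2=4.0, al_si_t2=0.28))
```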
7.
High-grade mylonites occur in the Takahama metamorphic rocks, a member of the high-pressure low-temperature type Nagasaki Metamorphic Rocks, western Kyushu, Japan. Mafic layers within the mylonites retain reaction microstructures consisting of margarite aggregates armoring both corundum and kyanite. The following retrograde reaction accounts well for the microstructures in the CaO–Al2O3–SiO2–H2O system: 3Al2O3 + 2Al2SiO5 + 2Ca2Al3Si3O12(OH) + 3H2O = 2Ca2Al8Si4O20(OH)4 (corundum + kyanite + clinozoisite + fluid = margarite). Mass balance analyses and chemical potential modeling reveal that the chemical potential gradients between kyanite and corundum have likely driven the transport of the CaO and SiO2 components. The mylonitization is considered to have taken place after peak metamorphism and before the above reaction, based on the following features: the approximately constant thickness of the margarite aggregates, the random orientation of margarite, and the local modification of garnet composition at a boudin neck that formed during mylonitization. The estimated peak temperature of 640 °C and the pressure–temperature conditions of the above reaction indicate that the mylonitization took place at temperatures between 530 and 640 °C and at pressures higher than 1.2 GPa, approximately equivalent to the depth of the lower crust of island arcs.
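As a quick sanity check of the stoichiometry quoted above (not part of the original study), the snippet below tallies each element on both sides of the corundum + kyanite + clinozoisite + fluid = margarite reaction and confirms that it balances.

```python
from collections import Counter

# Element counts per formula unit, as written in the reaction above
corundum     = {"Al": 2, "O": 3}                              # Al2O3
kyanite      = {"Al": 2, "Si": 1, "O": 5}                     # Al2SiO5
clinozoisite = {"Ca": 2, "Al": 3, "Si": 3, "O": 13, "H": 1}   # Ca2Al3Si3O12(OH)
water        = {"H": 2, "O": 1}                               # H2O
margarite2   = {"Ca": 2, "Al": 8, "Si": 4, "O": 24, "H": 4}   # Ca2Al8Si4O20(OH)4

def total(side):
    out = Counter()
    for coeff, phase in side:
        for element, n in phase.items():
            out[element] += coeff * n
    return out

left  = total([(3, corundum), (2, kyanite), (2, clinozoisite), (3, water)])
right = total([(2, margarite2)])
print(left == right)  # True: Ca, Al, Si, O and H all balance
```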
8.
Minor element (Ca, Cr, and Mn) concentrations in amoeboid olivine aggregates (AOAs) from primitive chondrites were measured and compared with those predicted by equilibrium condensation in the solar nebula. CaO concentrations in forsterite are low, particularly in porous aggregates. A plausible explanation is that an equilibrium Ca activity was not maintained during olivine condensation. CaO and MnO in forsterite are negatively correlated, with CaO being higher in compact aggregates. This suggests that the compact aggregates formed either by prolonged reheating of the porous aggregates or by condensation and aggregation of forsterite during very slow cooling in the nebula.
9.
The application of blended acquisition has drawn considerable attention owing to its ability to improve operational efficiency as well as data quality and health, safety and environment performance. Furthermore, acquiring fewer data benefits the economics of a survey, while the desired data density is still realizable via subsequent data reconstruction. The use of fewer detectors and sources also minimizes operational risks in the field. Therefore, a combined implementation of these technologies can potentially enhance the value of a seismic survey further. One way to encourage this is to minimize any imperfection in deblending and data reconstruction during processing. In addition, one may derive survey parameters that enable a further improvement in these processes, as introduced in this study. The proposed survey-design workflow iteratively performs the following steps to derive the survey parameters responsible for source blending as well as the spatial sampling of detectors and sources. The first step is the application of blending and sampling operators to unblended and well-sampled data. We then apply closed-loop deblending and data reconstruction. The residual for a given design from this step is evaluated and subsequently used by genetic algorithms to simultaneously update the survey parameters related to both blending and spatial sampling. The updated parameters are fed into the next iteration until they satisfy the given termination criteria. We also propose a repeated encoding sequence to form the parameter sequence in the genetic algorithms, keeping the size of the problem space manageable. The results of the proposed workflow are outlined using blended dispersed-source-array data incorporating different scenarios that represent acquisition in marine, transition-zone and land environments. Clear differences attributable solely to the parameter design are easily recognizable. Additionally, a comparison among different optimization schemes illustrates the ability of genetic algorithms, along with a repeated encoding sequence, to find better solutions within a computationally affordable time. The optimized parameters yield a notable enhancement in deblending and data reconstruction quality and consequently provide optimal acquisition scenarios.
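To make the "repeated encoding sequence" idea more concrete, here is a minimal sketch (my own illustration, not the authors' code) in which a short template of genes is tiled across all sources, so the genetic algorithm searches only the template rather than one gene per source. The gene meaning (normalized time dithers), the toy fitness in place of the deblending/reconstruction residual, and all sizes are assumptions.

```python
import random

N_SOURCES = 200      # sources in the survey (assumed)
TEMPLATE_LEN = 10    # length of the repeated encoding sequence (assumed)

def expand(template):
    """Tile the short template across all sources: source i gets template[i % TEMPLATE_LEN]."""
    return [template[i % TEMPLATE_LEN] for i in range(N_SOURCES)]

def fitness(template):
    """Stand-in for the deblending/data-reconstruction residual of the expanded design;
    this toy objective simply favours a wide spread of time dithers (lower is better)."""
    dithers = expand(template)
    return -(max(dithers) - min(dithers))

def evolve(pop_size=30, generations=50, mutation_rate=0.1):
    population = [[random.uniform(0.0, 1.0) for _ in range(TEMPLATE_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, TEMPLATE_LEN)
            child = a[:cut] + b[cut:]   # one-point crossover on the short template only
            child = [random.uniform(0.0, 1.0) if random.random() < mutation_rate else g
                     for g in child]
            children.append(child)
        population = survivors + children
    return min(population, key=fitness)

best_template = evolve()
print(expand(best_template)[:10])  # dithers assigned to the first ten sources
```

Because the search space is the 10-gene template rather than 200 independent genes, the population needed for convergence stays small, which is the point of the repeated encoding.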
10.
The Oto-Zan lava in the Setouchi volcanic belt is composed of phenocryst-poor, sparsely plagioclase-phyric andesites (sanukitoids) and forms a composite lava flow. The phenocryst assemblages and element abundances change but Sr–Nd–Pb isotopic compositions are constant throughout the lava flow. The sanukitoid at the base is a high-Mg andesite (HMA) and contains Mg- and Ni-rich olivine and Cr-rich chromite, suggesting the emplacement of a mantle-derived hydrous (7 wt % H2O) HMA magma. However, Oto-Zan sanukitoids contain little H2O and are phenocryst-poor. The liquid lines of descent obtained for an Oto-Zan HMA at 0.3 GPa in the presence of 0.7–2.1 wt % H2O suggest that mixing of an HMA magma with a differentiated felsic melt can reasonably explain the petrographical and chemical characteristics of Oto-Zan sanukitoids. We propose a model whereby a hydrous HMA magma crystallizes extensively within the crust, resulting in the formation of an HMA pluton and causing liberation of H2O from the magma system. The HMA pluton, in which interstitial rhyolitic melts still remain, is then heated from the base by intrusion of a high-T basalt magma, forming an H2O-deficient HMA magma at the base of the pluton. During ascent, this secondary HMA magma entrains the overlying interstitial rhyolitic melt, resulting in variable self-mixing and formation of a zoned magma reservoir, comprising more felsic magmas upwards. More effective upwelling of more mafic, and hence less viscous, magmas through a propagated vent finally results in the emplacement of the composite lava flow.
KEY WORDS: high-Mg andesite; sanukitoid; composite lava; solidification; remelting
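For reference, the mixing hypothesis above can be written with the standard two-endmember mass-balance relation (a general textbook formula, not a result specific to this study), where f is the mass fraction of the HMA magma and C is any element concentration:

$$C_{\text{mix}} = f\,C_{\text{HMA}} + (1 - f)\,C_{\text{felsic}}$$

Varying f between the basal HMA and the entrained rhyolitic melt traces the compositional range expected within the composite flow.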