1.
The processes of partial melting and magmatic diapirism within the lower crust are evaluated using a numerical underplating model. Fully molten basalt (T = 1200°C) is emplaced at the Moho beneath solid granite (T = 750°C) so that a melt front grows into the granite. If diapirism does not occur, this melt front reaches a minimum depth in the crust before crystallization takes place (as in the molten basalt). The density contrast between the partially molten granite layer and the overlying solid granite can lead to a Rayleigh-Taylor instability (RTI), which results in diapiric rise of the partially molten granite. Assuming a binary eutectic system for both the granite and the underplating basalt and a temperature- and stress-dependent rheology for the granite, we numerically solve the governing equations and find (a) that diapirism occurs only within a certain but possibly realistic range of parameters, and (b) that if diapirs occur, they do not rise to levels shallower than 15 or perhaps 12 km. The growth rate depends on the degree of melting and the thickness of the partially molten layer, as well as on the viscosity of the solid and the partially molten granite. From a comparison of the growth rate with the velocity of a Stefan front, it is possible to predict whether a melt front will become unstable and result in diapiric ascent, or whether a partially molten layer is created that remains at depth. We carry out such a comparison using our thermodynamically and thermomechanically consistent model of melting and diapirism.
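The comparison described in the abstract — diapiric growth rate versus melt-front velocity — can be sketched with order-of-magnitude scalings. This is not the paper's model: the Rayleigh-Taylor prefactor (c ≈ 13, the classical free-slip value), the Stefan-front form x(t) = 2λ√(κt), and all numerical values below are illustrative assumptions.

```python
import math

def rt_growth_rate(delta_rho, g, h, mu, c=13.04):
    """Characteristic Rayleigh-Taylor growth rate (1/s) of a buoyant
    viscous layer of thickness h under a viscous lid. The prefactor
    c ~ 13 is the classical free-slip value (assumption, not from
    the paper)."""
    return delta_rho * g * h / (c * mu)

def stefan_front_velocity(kappa, t, lam=1.0):
    """Velocity of a diffusion-controlled melt front at time t,
    assuming x(t) = 2*lam*sqrt(kappa*t), so v(t) = lam*sqrt(kappa/t).
    lam is a dimensionless Stefan parameter taken as O(1) here."""
    return lam * math.sqrt(kappa / t)

# Illustrative values (assumptions, not the paper's parameters):
delta_rho = 100.0   # kg/m^3, density deficit of partially molten granite
g = 9.81            # m/s^2
h = 2000.0          # m, thickness of the partially molten layer
mu = 1e18           # Pa s, effective viscosity of the solid granite
kappa = 1e-6        # m^2/s, thermal diffusivity
t = 1e13            # s, ~0.3 Myr after underplating

sigma = rt_growth_rate(delta_rho, g, h, mu)
v_rt = sigma * h                     # characteristic diapiric rise velocity
v_front = stefan_front_velocity(kappa, t)

# The layer goes unstable (diapirism) roughly when the RT rise
# velocity outruns the advancing melt front:
print(f"RT rise velocity : {v_rt:.2e} m/s")
print(f"front velocity   : {v_front:.2e} m/s")
print("diapirism expected" if v_rt > v_front else "layer stays at depth")
```

With these numbers the two velocities are comparable (a few times 10⁻¹⁰ m/s), which illustrates why the outcome is parameter-sensitive, as the abstract states.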
2.
3.
Summary. The first DEKORP profile, DEKORP 2-S, a 250 km long line perpendicular to the Variscan strike direction, has provided evidence of major crustal shortening during the Variscan orogeny. Sporadic dipping events in a generally transparent upper crust are interpreted as thrust faults, while the highly reflective lower crust fits into the general picture of Palaeozoic provinces. Correlations are established between certain reflectivity patterns and rheology. Moho depths and reflecting lamellae are considered to be post-Variscan.
4.
The new solar telescope GREGOR is designed to observe small-scale dynamic magnetic structures below a size of 70 km on the Sun with high spectral resolution and polarimetric accuracy. For this purpose, the polarimetric concept of GREGOR is based on a combination of post-focus polarimeters with pre-focus equipment for high-precision calibration. The Leibniz Institute for Astrophysics Potsdam developed the GREGOR calibration unit, which is an integral part of the telescope. We give an overview of the function and design of the calibration unit and present the results of an extensive series of tests carried out at the Solar Observatory "Einsteinturm" and at GREGOR. (© 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
5.
While the term "volunteered geographic information" (VGI) has become a buzzword in debates on the geoweb, online cartography and digital geoinformation, the scope and reach of VGI remain underexplored. Drawing on the literature on the social implications of VGI, this article first explores differences between VGI initiatives through a comparative case study of social biases within OSM and Wikimapia data in the fragmented social setting of Jerusalem, Israel. The results of this analysis turn out to be highly contradictory between the two projects, which challenges widely accepted assumptions about the imprint of social inequalities and digital divides on VGI. This observation guides, secondly, a discussion of diversity within the category of VGI. Arguing that the mapping communities, data formats and knowledge types behind VGI are extremely dissimilar, the paper proceeds by questioning the consistency and utility of VGI as a category. In seeking a more comprehensive typology of VGI, Edney's notion of cartographic modes is presented as an approach towards a more contextualized understanding of VGI projects that embraces their underlying cultural, social and technical relations. Consequently, the paper suggests empirical research on the cartographic modes of a broad series of VGI projects through qualitative and quantitative methods alike.
6.
PEPSI is the bench-mounted, two-arm, fibre-fed and stabilized Potsdam Echelle Polarimetric and Spectroscopic Instrument for the 2×8.4 m Large Binocular Telescope (LBT). Three spectral resolutions of 43 000, 120 000 or 270 000 cover the entire optical/red wavelength range from 383 to 907 nm in three exposures. Two 10.3k×10.3k CCDs with 9-µm pixels and peak quantum efficiencies of 94-96 % record a total of 92 échelle orders. We introduce a new variant of a wave-guide image slicer with 3, 5, and 7 slices and peak efficiencies between 92-96 %. A total of six cross-dispersers cover the six wavelength settings of the spectrograph, two of them always simultaneously. These are made of a VPH grating sandwiched between two prisms. The peak efficiency of the system, including the telescope, is 15 % at 650 nm, and still 11 % and 10 % at 390 nm and 900 nm, respectively. In combination with the 110 m² light-collecting capability of the LBT, we expect a limiting magnitude of ≈20th mag in V in the low-resolution mode. The R = 120 000 mode can also be used with two dual-beam Stokes IQUV polarimeters. The 270 000 mode is made possible by the 7-slice image slicer and a 100-µm fibre through a projected sky aperture of 0.74″, comparable to the median seeing at the LBT site. The 43 000 mode, with 12-pixel sampling per resolution element, is our bad-seeing or faint-object mode. Any of the three resolution modes can be used either with sky fibres for simultaneous sky exposures or with light from a stabilized Fabry-Pérot étalon for ultra-precise radial velocities. CCD image processing is performed with the dedicated data-reduction and analysis package PEPSI-S4S. Its full error propagation through all image-processing steps allows an adaptive selection of parameters by using statistical inferences and robust estimators. A solar feed makes use of PEPSI during daytime, and a 500-m feed from the 1.8 m VATT can be used when the LBT is otherwise busy.
In this paper, we present the basic instrument design, its realization, and its characteristics. Some pre-commissioning first-light spectra demonstrate the basic functionality. (© 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
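What the three resolving powers quoted above mean per resolution element can be checked with the definition R = λ/Δλ (equivalently Δv = c/R in velocity units). The R values come from the text; the 650 nm reference wavelength is simply where the quoted 15 % peak efficiency occurs, and the helper name is invented for this sketch.

```python
# Resolution element of a spectrograph: Δλ = λ/R, Δv = c/R.
C_KMS = 299_792.458  # speed of light in km/s

def res_element(R, lam_nm=650.0):
    """Return (Δλ in picometres, Δv in km/s) for resolving power R
    at wavelength lam_nm (nanometres)."""
    return lam_nm / R * 1e3, C_KMS / R

# PEPSI's three modes (resolving powers from the abstract above):
for R in (43_000, 120_000, 270_000):
    dlam_pm, dv = res_element(R)
    print(f"R = {R:>7,}: Δλ = {dlam_pm:5.2f} pm, Δv = {dv:5.2f} km/s")
```

At R = 120 000 a resolution element at 650 nm is about 5.4 pm, or roughly 2.5 km/s; the 270 000 mode narrows this to about 1.1 km/s, which is why it is paired with the narrow 0.74″ sky aperture.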
7.
Our paper presents a theoretical approach to critical research on web 2.0 cartographies. Within the geoweb, dynamic and collaborative web-based maps have become a popular medium for collating and communicating geographic information. Web 2.0 cartographies are often promoted as facilitating public participation and democratizing geographic knowledge. Such claims demand a closer look at the processes through which people engage in these cartographic projects and at the multiple actors, institutions, norms and technologies at work. In the context of 'theorizing the geoweb', we propose conceptual tools for analyzing these myriad interactions within web 2.0 cartographies. We understand web 2.0 cartographies as assemblages of subjects, materialities and practices, or 'actor-networks'. Yet explorations of actor-networks describe existing relations and consequently tend to overlook what has been excluded or lies outside such assemblages. To overcome this blind spot, we suggest bringing actor-network theory together with the concepts of hegemonic discourses, contingency and the political from Chantal Mouffe and Ernesto Laclau. These two political theorists stress the idea that specific social realities become fixed, sedimented and perceived as natural, while other possible social realities become marginalized. Using the example of the dynamic 'Palestine Crisis Map' (an Ushahidi Crowdmap), we demonstrate a methodology that emphasizes sensitivity towards moments of exclusion and struggle, where the political unfolds. Theorizing the political in this way extends the processual approach within Critical Cartography and offers a conceptual basis for critical research on the social dimensions of web 2.0 cartographies and geoweb practices.
8.
The US Army Corps of Engineers (USACE) and the South Florida Water Management District (SFWMD) are partners in an ambitious plan to restore water flows throughout the Everglades ecosystem. An important component of the restoration plan involves storing excess stormwater deep underground in the Floridan Aquifer System using aquifer storage and recovery (ASR) wells. In order to determine the optimal ASR system and to assess environmental impacts, USACE spent over 11 years and significant resources developing a three-dimensional groundwater model of the Floridan Aquifer System covering a large portion of the Florida peninsula. This SEAWAT model is capable of evaluating changes in aquifer pressures and density-dependent flows across the entire study area. The model has already been used to evaluate the Everglades ASR system, but it could also be used by water managers for other important water resources studies in Florida, including water supply estimates and adaptation to climate change. As part of an effort to make the model more readily available for such studies, this paper documents and summarizes the overall development of the SEAWAT model, including a discussion of the intensive calibration and validation efforts undertaken during model development. The paper then demonstrates the use of the model on Everglades ASR project alternatives. Lastly, the paper outlines potential future uses of the model along with its overall limitations. Supplementary online resources are also included that provide researchers with further detail on the model development effort beyond the scope of this summary article, as well as the model development databases.
9.
On the design of formal theories of geographic space
This paper discusses the design of formal theories of geographic space for application in Geographic Information Systems (GIS). GIS software is an implementation of formal theories of geographic space. The notions of formal theories are introduced and discussed in the context of examples from the GIS field. Our approach applies the general framework of formal theories to the special class of theories of geographic space, in particular to the geometry of geographic space. A framework is introduced for characterizing and evaluating formal theories of geographic space and the process of their design. This is used to provide (1) a classification of formal theories of geographic space, (2) criteria for their adequacy, and (3) an evaluation of design decisions in the process of formalization. The paper demonstrates the choices in the design of a GIS and the dependencies between these choices. Considering the design space for theories underlying a GIS, we can see that current GIS are based on one choice: analytical geometry. Other designs are possible, and a systematic exploration of alternative types of GIS, for example based on constraints or on stored spatial relations, becomes necessary. Received: 30 April 1997 / Accepted: 8 March 1999
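The design choice the abstract contrasts — analytical geometry versus stored spatial relations — can be made concrete with a toy sketch. Everything here (function names, the rectangle test, the place names) is invented for illustration and is not from the paper.

```python
# (a) Analytical geometry: spatial facts are *computed* from coordinates.
def inside_rect(point, rect):
    """Point-in-axis-aligned-rectangle test on coordinates."""
    (x, y), (x0, y0, x1, y1) = point, rect
    return x0 <= x <= x1 and y0 <= y <= y1

# (b) Stored spatial relations: qualitative facts are *asserted* and
# further facts are *inferred* (here, transitivity of containment),
# with no coordinates involved at all.
contains = {("Europe", "Austria"), ("Austria", "Vienna")}

def contains_q(outer, inner, facts):
    """True if 'outer contains inner' is stored or follows transitively."""
    if (outer, inner) in facts:
        return True
    return any(a == outer and contains_q(b, inner, facts)
               for (a, b) in facts)

assert inside_rect((2, 3), (0, 0, 10, 10))       # computed from geometry
assert contains_q("Europe", "Vienna", contains)  # inferred, no geometry
```

The two designs answer the same kind of query ("is A inside B?") from entirely different formal theories, which is the design space the paper asks GIS builders to explore systematically.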
10.
In a variety of biological and physical phenomena, temporal fluctuations are found which are not explainable as consequences of statistically independent random events. If these fluctuations are characterized by a power spectral density S(f) decaying as f^(-γ) at low frequencies, this behaviour is called 1/f noise. Counting statistics applied to earthquake activity data leads to three time scales with different characteristics, represented by the exponent γ: at interval lengths of less than 1 h, the shocks are randomly distributed, as in a Poisson process. For medium time intervals (1 day to 3 months), the exponent 1 + γ is larger (1.4 for M0 = 3), but approaches unity for higher threshold magnitudes M0. In longer time ranges the exponent assumes values near 1.55, however with increasing statistical variation at higher M0 due to lower counts. The temporal sequence is thus different from white noise; it might therefore be fruitful to apply neural network algorithms, because this method allows predictions in some other cases with similar characteristics.
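The short-interval regime described above — shocks randomly distributed as in a Poisson process — has a simple counting-statistics signature: the Fano factor (variance over mean of window counts) stays near 1 for uncorrelated events, whereas clustered seismicity drives it above 1 and pushes the spectral exponent away from white noise. A minimal stdlib sketch, with an arbitrary event rate and seed (not data from the paper):

```python
import random
import statistics

random.seed(42)

# Simulate a homogeneous Poisson process: exponential waiting times.
rate = 5.0          # events per unit time (illustrative)
t_end = 10_000.0
times, t = [], 0.0
while t < t_end:
    t += random.expovariate(rate)
    times.append(t)

def fano(times, T, t_end):
    """Fano factor Var[N]/E[N] of event counts in windows of length T.
    F ~ 1 for a Poisson process; F > 1 indicates clustering."""
    n_bins = int(t_end // T)
    counts = [0] * n_bins
    for t in times:
        i = int(t // T)
        if i < n_bins:           # the last event may overshoot t_end
            counts[i] += 1
    return statistics.variance(counts) / statistics.mean(counts)

for T in (1.0, 10.0, 100.0):
    print(f"T = {T:>6}: Fano factor = {fano(times, T, t_end):.2f}")
```

For this synthetic catalogue the Fano factor is close to 1 at every window length, the behaviour the abstract reports only for intervals shorter than about an hour; real catalogues depart from it at longer time scales.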