The solid solution sanmartinite (ZnWO4)–cuproscheelite (CuWO4) has been studied using Cu 2p X-ray absorption spectroscopy. While a single L3 absorption peak is observed for CuWO4, two distinct L3 absorption peaks with a separation of ~0.8 eV are observed for the intermediate samples in the solid solution. The two peaks represent distinct Cu sites: one with all CuO6 next-nearest neighbours in the (Cu,Zn)O6 chains, the other having at least one ZnO6 next-nearest neighbour. Both sites show a linear increase in covalency with increasing Cu content. The relative intensities of the two absorption peaks depend on the Cu content and have been used to model the site occupancies. The results reveal that the local structural effects can be associated with a composition-dependent structural phase transition from P2/c (ZnWO4) to P$\bar{1}$ (CuWO4). Deviations from a single-site model are explained in terms of the local environments, and evidence for site preferences and local clustering is explored.
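As an illustration of the site-occupancy modelling described above, a random (binomial) distribution of Cu and Zn along the chains predicts the fraction of Cu sites in each of the two environments. The sketch below is hypothetical: the assumption of two next-nearest neighbours per site in the chain is ours, not necessarily the occupancy model used in the study.

```python
def random_site_fractions(x_cu, n_neighbours=2):
    """Fractions of Cu sites with all-Cu vs. at least one Zn
    next-nearest neighbour, assuming random (binomial) occupancy
    of the (Cu,Zn)O6 chain sites at Cu mole fraction x_cu.
    """
    p_all_cu = x_cu ** n_neighbours      # all chain neighbours are Cu
    return p_all_cu, 1.0 - p_all_cu      # complementary Zn-adjacent fraction

# At x_cu = 0.5, a random solid solution puts 25% of Cu sites in
# the all-Cu environment and 75% next to at least one Zn.
fractions = random_site_fractions(0.5)
```

Deviations of the measured peak-intensity ratio from such binomial fractions would then signal the site preferences and local clustering mentioned above.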
Many low-efficiency hydrocarbon reservoirs are productive largely because effective reservoir permeability is controlled by faults and natural fractures. Accurate and low-cost information on basic fault and fracture properties, orientation in particular, is critical in reducing well costs and increasing well recoveries. This paper describes how we used an advanced numerical modelling technique, the finite element method (FEM), to compute site-specific in situ stresses and rock deformation and to predict fracture attributes as a function of material properties, structural position and tectonic stress. Presented are the numerical results of two-dimensional, plane-strain end-member FEM models of a hydrocarbon-bearing fault-propagation-fold structure. Interpretation of the modelling results remains qualitative because of the intrinsic limitations of numerical modelling; however, it still allows comparisons with the limited geological and geophysical data available.
In all models, the weak mechanical strength and flow properties of a thick shale layer (the main seal) lead to a decoupling of the structural deformation of the shallower sediments from the underlying sediments and basement, and result in flexural slip across the shale layer. All models predict rock fracturing to initiate at the surface and to expand with depth under increasing horizontal tectonic compression. The stress regime for the formation of new fractures changes from compressional to shear with depth. If pre-existing fractures exist, only (sub)horizontal fractures are predicted to open, thus defining the principal orientation of effective reservoir permeability. In models that do not include a blind thrust fault in the basement, flexural amplification of the initial fold structure generates additional fracturing in the crest of the anticline, controlled by the material properties of the rocks. The folding-induced fracturing expands laterally along the stratigraphic boundaries under enhanced tectonic loading. Models incorporating a blind thrust fault correctly predict the formation of secondary synthetic and antithetic mesoscale faults in the basement and sediments of the hanging wall. Some of these faults cut reservoir and/or seal layers, and thus may influence effective reservoir permeability and affect seal integrity. The predicted faults divide the sediments across the anticline into several compartments with different stress levels and different degrees of rock failure (and proximity to failure). These numerical model outcomes can assist classic interpretation of seismic and well-bore data in the search for fractured and overpressured hydrocarbon reservoirs.
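A typical post-processing step for FEM stress output of this kind is to quantify proximity to failure with a Mohr-Coulomb criterion. The sketch below is a minimal illustration under assumed cohesion and friction values; it is not the failure model actually used in the paper.

```python
import math

def mohr_coulomb_margin(sigma1, sigma3, cohesion, phi_deg):
    """Ratio of shear-stress demand to Mohr-Coulomb shear strength on the
    critically oriented plane (compression positive). Values >= 1 indicate
    the stress state has reached the failure envelope.
    """
    phi = math.radians(phi_deg)
    radius = 0.5 * (sigma1 - sigma3)           # Mohr circle radius
    center = 0.5 * (sigma1 + sigma3)           # Mohr circle centre
    tau = radius * math.cos(phi)               # shear stress on critical plane
    sigma_n = center - radius * math.sin(phi)  # normal stress on critical plane
    strength = cohesion + sigma_n * math.tan(phi)
    return tau / strength

# Hypothetical stresses in MPa: this element remains below failure (margin < 1).
margin = mohr_coulomb_margin(sigma1=50.0, sigma3=10.0, cohesion=10.0, phi_deg=30.0)
```

Mapping such a margin over every element of a model cross-section gives the kind of compartment-by-compartment proximity-to-failure picture described above.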
The study examines the spatial relationships between sediment yield and 15 independent environmental variables in 54 catchments in South Africa. Rooseboom's (1978) data on the sediment yield from the catchments were standardized for a single time period. Bivariate regression analyses reveal no simple relationships. Multivariate regression analyses conducted for the whole of South Africa and for various sub-areas indicate that latitude and longitude are the primary variables affecting spatial variations in sediment yield. This may be because latitude and longitude act as surrogate variables reflecting variation in other environmental variables (e.g. geology, vegetation). Within the sub-areas, 43.4% to 97.8% of the variation in sediment yield is explained by the combined variation in a number of different environmental variables. This study highlights the need to collect and analyse more sediment-yield data, which would allow the analyses to be refined in order to predict sediment yields from ungauged catchments in South Africa.
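The multivariate regression approach can be sketched with ordinary least squares. The data below are synthetic stand-ins (the actual sediment-yield data come from Rooseboom, 1978) and the coefficients are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 54 catchments: sediment yield driven mainly by
# latitude and longitude (acting as surrogates for geology, vegetation, etc.).
lat = rng.uniform(-34.0, -22.0, 54)
lon = rng.uniform(16.0, 32.0, 54)
sed_yield = 5.0 + 0.8 * lat - 0.3 * lon + rng.normal(0.0, 1.0, 54)

# Multivariate OLS: design matrix with an intercept column.
X = np.column_stack([np.ones(54), lat, lon])
coef, *_ = np.linalg.lstsq(X, sed_yield, rcond=None)

# Fraction of variance explained, analogous to the 43.4-97.8% reported above.
pred = X @ coef
r2 = 1.0 - np.sum((sed_yield - pred) ** 2) / np.sum((sed_yield - sed_yield.mean()) ** 2)
```

With real data, the explained-variance fraction plays the role of the 43.4-97.8% figures quoted for the sub-areas.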
A two-dimensional vertically integrated ice flow model has been developed to test the importance of various processes and concepts used for the prediction of the contribution of the Greenland ice sheet to sea-level rise over the next 350 years (short-term response). The mass balance is modelled by the degree-day method and the energy-balance method. The lithosphere is considered to respond isostatically to a point load, and the time evolution of the bedrock follows from a viscous asthenosphere. According to the IPCC-IS92a scenario (with constant aerosols after 1990), the Greenland ice sheet is likely to cause a global sea-level rise of 10.4 cm by AD 2100. It is shown, however, that the result is sensitive to the precise model formulation and that simplifications such as those used in the sea-level projection in the IPCC-96 report yield less accurate results. Our model results indicate that, on a time scale of a hundred years, including the dynamic response of the ice sheet yields more mass loss than the fixed response in which changes in geometry are not incorporated. It appears to be important to consider sliding, as well as the fact that climate sensitivity increases for larger perturbations. Variations in predicted sea-level change on a time scale of a hundred years depend mostly on the initial state of the ice sheet. On a time scale of a few hundred years, however, the variability in the predicted melt is dominated by the variability in the climate scenarios.
Received: 21 August 1996 / Accepted: 12 May 1997
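The degree-day mass-balance method mentioned above relates melt to the sum of positive daily temperatures. The sketch below uses a hypothetical degree-day factor and a toy temperature series, not the paper's calibrated values.

```python
import numpy as np

def degree_day_melt(daily_temp_c, ddf=8.0):
    """Melt (mm water equivalent) from the positive degree-day sum.
    ddf is the degree-day factor (mm w.e. per positive degree-day);
    8.0 is an illustrative value for ice, not the model's parameter.
    """
    pdd = np.sum(np.maximum(daily_temp_c, 0.0))  # positive degree-day sum
    return ddf * pdd

# Five days of toy temperatures: only above-freezing days contribute,
# giving 8.0 * (1.0 + 3.0 + 0.5) = 36 mm w.e.
melt = degree_day_melt(np.array([-2.0, 1.0, 3.0, 0.5, -1.0]))
```

The energy-balance alternative replaces the single temperature-indexed factor with an explicit surface energy budget, which is why the two methods can give different sensitivities to warming.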
While carbon pricing is widely seen as a crucial element of climate policy and has been implemented in many countries, it has also met with strong resistance. We provide a comprehensive overview of public perceptions of the fairness of carbon pricing and how these affect policy acceptability. To this end, we review evidence from empirical studies on how individuals judge personal, distributional and procedural aspects of carbon taxes and cap-and-trade. In addition, we examine preferences for particular redistributive and other uses of revenues generated by carbon pricing and their role in instrument acceptability. Our results indicate strong concern over distributional effects, particularly regarding policy impacts on poor people, which in turn reduces policy acceptability. In addition, people show little trust in the capacity of governments to put the revenues of carbon pricing to good use. Somewhat surprisingly, most studies do not indicate clear public preferences for using revenues to ensure fairer policy outcomes, notably by reducing the policy's regressive effects. Instead, many people prefer using revenues for ‘environmental projects’ of various kinds. We end by providing recommendations for improving public acceptability of carbon pricing. One suggestion to increase policy acceptability is combining the redistribution of revenue to vulnerable groups with funding for environmental projects, such as renewable energy.
Key policy insights
If people perceive carbon pricing instruments as fair, this increases policy acceptability and support.
People’s satisfaction with information provided by the government about the policy instrument increases acceptability.
While people express high concern over uneven distribution of the policy burden, they often prefer using carbon pricing revenues for environmental projects instead of compensation for inequitable outcomes.
Recent studies find that people’s preferences shift to using revenues for making policy fairer if they better understand the functioning of carbon pricing, notably that relatively high prices of CO2-intensive goods and services reduce their consumption.
Combining the redistribution of revenue to support both vulnerable groups and environmental projects, such as renewable energy, seems to increase policy acceptability the most.
A three-step sequential extraction procedure with Milli-Q water, CaCl2 and H3PO4 was applied to extract arsenic species from lichen transplants and airborne particulate matter (fine and coarse fractions). The samples used in this work were collected in 1994–1995 near coal-fired power plants. Both the transplanted lichens and the airborne particulate matter were exposed to the same environment simultaneously. Arsenic species identification and quantification were performed by HPLC–UV–HG–AFS. Inorganic forms of arsenic (arsenite and arsenate) were present in significant amounts in most of the samples. Organic forms of arsenic (monomethylarsonic acid and dimethylarsinic acid) were identified only in the lichens, which may indicate biotransformation of inorganic arsenic.
The evolution of precipitating convective systems in West Africa has been a research topic throughout the past three decades and is considered to be influenced by surface–atmosphere interactions. This study builds on the previous research by examining the sensitivity of a mesoscale convective system (MCS) to a change in the vegetation cover by using a regional atmospheric model with a high horizontal resolution. Vegetation cover values in the region between 10 and 15°N have increased by 10–30% over the last 20 years. The effect of both an increase and a decrease in vegetation cover by 10, 20 and 30% is investigated. The MCS case selected occurred on 11 June 2006 and was observed during the African Monsoon Multidisciplinary Analysis field campaign in Dano, Burkina Faso. The model is able to reproduce the most important characteristics of the MCS and the atmospheric environment. For the investigated case, no clear precipitation response of the MCS to the applied vegetation scenarios is found. The vegetation changes do alter the surface fluxes in the days before the MCS arrives, which have a clear effect on the modelled convective available potential energy (CAPE) values. However, a link between CAPE, mesoscale circulation and rainfall amounts could not be demonstrated as a dynamical mechanism is found to counteract the CAPE signal. By using a kilometre-scale model, a change in the cold pool dynamics of the MCS could be detected which results from alterations in boundary layer moisture. The effect of vegetation changes on the MCS is thus not straightforward and a complex interaction between different processes should be taken into account.
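CAPE, the quantity through which the surface-flux changes act in the study above, is the vertical integral of positive parcel buoyancy. The sketch below is a minimal trapezoidal version over given virtual-temperature profiles; a full calculation would also compute the pseudo-adiabatic parcel ascent, which is omitted here.

```python
import numpy as np

def cape(z, t_parcel, t_env, g=9.81):
    """Convective available potential energy (J/kg) by trapezoidal
    integration of positive buoyancy, given parcel and environment
    virtual temperatures (K) on heights z (m).
    """
    buoyancy = g * (t_parcel - t_env) / t_env
    buoyancy = np.maximum(buoyancy, 0.0)  # only positively buoyant layers count
    return float(np.sum(0.5 * (buoyancy[1:] + buoyancy[:-1]) * np.diff(z)))

# A 1 K parcel excess through a 1000 m layer gives ~32.7 J/kg.
value = cape(np.array([0.0, 1000.0]),
             np.array([301.0, 301.0]),
             np.array([300.0, 300.0]))
```

Because CAPE is this integral of the temperature difference, the pre-storm surface-flux changes alter it even when the rainfall response is ultimately dominated by the dynamics.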
We describe a new calibration procedure included in the production process of Scintec's displaced-beam laser scintillometers (SLS-20/40) and its effect on their measurement accuracy. The calibration procedure determines the factual displacement distances of the laser beams at the receiver and transmitter units, instead of assuming a prescribed displacement distance of 2.70 mm. For this study, four scintillometers operated by Wageningen University and the German Meteorological Service were calibrated by Scintec and their data re-analyzed. The results show that significant discrepancies may exist between the factual and the prescribed displacement distances. Generally, the factual displacement is about 0.1 mm smaller than 2.70 mm, but extremes varied between 0.04 and 0.24 mm. Correspondingly, using non-calibrated scintillometers may result in biases as large as 20% in the estimates of the inner-scale length, $l_{0}$, the structure parameter of the refractive index, $C_n^2$, and the friction velocity, $u_{*}$. The bias in the sensible heat flux was negligible, because the biases in $C_n^2$ and $u_{*}$ cancel. Hence, the discrepancies explain much of the long-observed underestimation of $u_{*}$ determined by these scintillometers. Furthermore, the calibration improves the mutual agreement between the scintillometers for $l_{0}$, but especially for $C_n^2$. Finally, it is noted that the calibrated displacement distances remain valid over time, and hence the results of the calibration can be applied retroactively.
A synthetic diurnal energy budget averaged for each month of the year shows that Lake Ontario loses very little heat at night during April, May, and June. The nightly losses during July, August, and September are conjectured to contribute significantly to the deepening of the thermocline through vertical convection.
There has been much improvement in numerical weather prediction since L.F. Richardson (1922, Weather Prediction by Numerical Process, Cambridge University Press, Cambridge, p. 236) wrote his famous book. NWP has primarily been successful in improving day-by-day forecasts, starting from an observed detailed initial condition (IC), out to about a week. The purpose of this paper is first to discuss the state of the art in long-range NWP by presenting results of a new large numerical experiment (named DERF90, from Dynamical Extended Range Forecasting in 1990 out to 90 days) conducted at the National Meteorological Center (NMC) during the summer and autumn of 1990 (Section 2). One hundred and twenty-eight 90-day global forecasts were made from successive daily ICs, giving ample opportunity to assess the skill of forecasts at lead times beyond one week. We then define the notion of a limit of predictability (LOP) and, following a procedure by Lorenz (1982), give a numerical estimate of the LOP using the DERF90 data set. We then produce a list of reasons why this estimate (LOP = 18 days) should not be taken too literally. In particular, we argue that the LOP varies as a function of the flow itself, and that it would be (much) larger if we had, as we ultimately will, a coupled ocean-atmosphere model for making long-lead forecasts. Last, but not least, we present results of empirical forecasts that point to modest but significant skill well beyond the traditional LOP (a few weeks). A specific recent example of empirical forecasting is discussed. Through Canonical Correlation Analysis (CCA), experimental forecasts are being made for United States surface temperatures at lead times of several seasons. While modest, the skill is significant in that it defies the existence of a 3-week LOP, and so demonstrates the potential for model improvements.
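The CCA step can be sketched numerically: canonical correlations are the singular values of the cross-product of orthonormal bases of the two centred data blocks (the Björck-Golub construction). The data below are synthetic; the study's actual predictor and predictand fields are not reproduced here.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between data blocks X (n, p) and Y (n, q),
    computed as singular values of Ux^T Uy, where Ux and Uy are
    orthonormal bases of the centred blocks from a thin SVD.
    """
    Ux = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)[0]
    Uy = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)[0]
    return np.linalg.svd(Ux.T @ Uy, compute_uv=False)

rng = np.random.default_rng(1)
signal = rng.normal(size=(200, 1))  # shared mode linking the two fields
X = np.hstack([signal + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
Y = np.hstack([signal + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
rho = canonical_correlations(X, Y)  # leading value near 1 reflects the shared mode
```

In a forecasting application of this kind, the leading canonical pairs would link antecedent predictor fields to later-season surface temperatures, and forecast skill rests on the leading correlations being robustly above noise.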