Similar Articles
20 similar articles found.
1.
In this work, we carried out a preliminary study of traffic-derived pollutants from primary sources (vehicles) and on roads (paved areas), road borders and surrounding areas. The study focuses on the identification, distribution and concentration of pollutants and magnetic carriers. Magnetic parameters and their analyses suggest that the magnetic signal of vehicle-derived emissions is controlled by a magnetite-like phase. Magnetic grain size estimations reveal the presence of fine particles (0.1–5 μm) that can be inhaled and are therefore dangerous to human health. Magnetic susceptibility results (about 175 × 10−5 SI) show a higher magnetic concentration (a magnetic enhancement) in the central area of the tollbooth line that is related to heavier traffic. In addition, magnetic susceptibility was measured on several roadside soils along a length of 120 km and used to generate a 2-D contour map, which shows higher magnetic values (100–200 × 10−5 SI) near the edge of the road. The observed distribution of magnetic values indicates that magnetic particles emitted by vehicles accumulate and are mainly concentrated within a few meters (1–2 m) of the edge of the road. Consequently, magnetic susceptibility seems to be a suitable indicator of traffic-related pollution. Non-magnetic studies show an enrichment of some trace elements, such as Ba, Cr, Cu, Zn and Pb, that are associated with traffic pollution. Furthermore, statistical correlations between the content of toxic trace metals and magnetic variables support the use of magnetic parameters as potential proxies for traffic-related pollution in this study area.
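As an illustration of the metal-susceptibility correlation the abstract mentions, here is a minimal Python sketch; the sample values and the choice of Pearson's r are assumptions, not the study's data or exact method:

```python
# Sketch: correlating magnetic susceptibility with a trace-metal content.
# All values below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

susceptibility = np.array([45, 80, 120, 160, 190, 60, 140])  # 10^-5 SI
lead_content = np.array([12, 20, 33, 41, 55, 15, 38])        # Pb, mg/kg

r, p_value = pearsonr(susceptibility, lead_content)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A strong, significant positive r supports using susceptibility
# as a proxy for traffic-related metal pollution.
```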

2.
The results of microbiological water quality monitoring in the Amur and Sungari rivers after the technogenic accident in Jilin province (China) in November 2005 are considered. A bioindication technique is used to show that various pollutants entered the Amur with the Sungari runoff: low-molecular volatile benzene derivatives from November 28 to December 2; naphthalene derivatives from November 29 to December 7; and nitrobenzene and high-molecular polyaromatic hydrocarbons on December 15–23, 2005. The major portion of pollutants moved along the right bank. Nitrifying bacteria were the most sensitive to the total pollution by toxic substances, especially in bottom water layers. The Amur water in the zone of influence of the Sungari was estimated to belong to quality classes IV–V, in the categories "polluted" and "dirty." The poor quality of Amur water persisted for 9 months throughout the Amur reach from the Sungari mouth to Khabarovsk City.

3.
Air pollution is one of the most important threats to humanity. It can damage not only human health but also the Earth's ecosystem. Because of its harmful effects, air pollution should be controlled very carefully. To assess the risk of air pollution in Istanbul, this paper uses process capability indices (PCIs), which are effective statistics for summarizing the performance of a process. Fuzzy PCIs are used to determine the levels of air pollutants measured at nine different stations in Istanbul. Robust PCIs (RPCIs) are used when the air pollutants are correlated. Fuzzy set theory has been applied to both PCIs and RPCIs to obtain more sensitive results. More flexible PCIs, obtained by using fuzzy specification limits and a fuzzy standard deviation, are used to evaluate Istanbul's air pollution level. Additionally, evaluation criteria have been constructed for the fuzzy PCIs to interpret the air pollution.
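A minimal sketch of the capability-index idea in Python: the crisp Cp formula, plus a fuzzy variant in which the specification limits become intervals at a chosen alpha-cut. The limits, data, and alpha-cut values are hypothetical, not the paper's:

```python
# Sketch: a crisp process capability index and a simple fuzzy variant
# using interval (alpha-cut) specification limits. Data are hypothetical.
import numpy as np

def cp(usl, lsl, sigma):
    """Classic process capability index Cp = (USL - LSL) / (6 sigma)."""
    return (usl - lsl) / (6.0 * sigma)

pollutant = np.random.default_rng(0).normal(38.0, 6.0, 500)  # e.g. PM10, ug/m3
sigma = pollutant.std(ddof=1)
print("crisp Cp:", round(cp(usl=60.0, lsl=0.0, sigma=sigma), 2))

# Fuzzy specification limits evaluated at an alpha-cut: each limit
# becomes an interval, so Cp becomes an interval as well.
usl_interval = (55.0, 65.0)  # alpha-cut of a triangular USL around 60
lsl_interval = (0.0, 5.0)    # alpha-cut of a triangular LSL around 2.5
cp_low = cp(usl_interval[0], lsl_interval[1], sigma)   # narrowest spec band
cp_high = cp(usl_interval[1], lsl_interval[0], sigma)  # widest spec band
print(f"fuzzy Cp at this alpha-cut: [{cp_low:.2f}, {cp_high:.2f}]")
```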

4.
Although strict legislation on vehicle emissions in Europe (EURO 4, EURO 5) will lead to a remarkable reduction of emissions in the near future, traffic-related air pollution can still be problematic due to large increases of traffic in certain areas. Many dispersion models for line sources, in most cases based on the Gaussian equation, have been developed to assess the impact of traffic on air pollution levels near roads. Previous studies gave evidence that such models tend to overestimate concentrations in low wind speed conditions or when the wind direction is almost parallel to the street orientation. This is of particular interest, since such conditions generally lead to the highest observed concentrations in the vicinity of streets. As many air quality directives impose limits on high percentiles of concentrations, it is important to have good estimates of these quantities in environmental assessment studies. The objective of this study is to evaluate a methodology for the computation of precisely those high percentiles required by, e.g., the EU daughter directive 99/30/EC (for instance, the 99.8 percentile for NO2). The model used in this investigation is a Markov Chain-Monte Carlo model for predicting pollutant concentrations, which, as shown here, performs well in low wind conditions. While usual Lagrangian models use deterministic time steps for the calculation of the turbulent velocities, the model presented here uses random time steps from a Monte Carlo simulation and a Markov Chain simulation for the sequence of the turbulent velocities. This results in a physically better approach when modelling dispersion in low wind speed conditions. When Lagrangian dispersion models are used for regulatory purposes, a meteorological pre-processor is necessary to obtain required input quantities such as the Monin-Obukhov length and friction velocity from routinely observed data. The model and the meteorological pre-processor applied here were tested against field data taken near a major motorway south of Vienna. The methodology is based on input parameters that are also available in usual environmental assessment studies. Results reveal that the approach examined is useful and leads to reasonable concentration levels near motorways compared to observations. We wish to thank Andreas Schopper (Styrian Government) for providing air quality values, M. Kalina for providing the raw data of the air quality stations near the motorway and J. Kukkonen for providing the road site data set from the Finnish Meteorological Institute (FMI). The study was partly funded by the Austrian science fund under the project P14075-TEC.
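The core of such a Lagrangian model is a Markov chain for the turbulent velocity. A minimal sketch follows, with illustrative parameters rather than the paper's calibration, including the kind of high-percentile computation the directive requires:

```python
# Sketch: a first-order Markov chain (AR(1)) for turbulent velocity, the
# core idea behind Lagrangian stochastic dispersion, plus the 99.8th
# percentile required by directives such as 99/30/EC. The parameters and
# the concentration surrogate are illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
n_steps, dt, tau_L, sigma_w = 20_000, 1.0, 100.0, 0.5  # steps, s, s, m/s

# Markov-chain update: w(t+dt) = a*w(t) + b*noise, with a = exp(-dt/tau_L)
a = np.exp(-dt / tau_L)
b = sigma_w * np.sqrt(1.0 - a**2)
w = np.empty(n_steps)
w[0] = 0.0
for i in range(1, n_steps):
    w[i] = a * w[i - 1] + b * rng.standard_normal()

# Stand-in "hourly concentrations" derived from the velocity record; in a
# real model these come from particle positions and source strength.
conc = np.abs(w) * 40.0
print("99.8th percentile concentration:", round(np.percentile(conc, 99.8), 1))
```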

5.
The Kyoto Protocol calls for industrialized nations to cut greenhouse gas emissions by 5% from 1990 levels by 2008–2012, so developed countries are presenting various policies to reduce the greenhouse gases produced in the transport sector. One of those policies is a modal shift from road freight to eco-friendly sea, inland waterway and railroad transportation. The increase of road freight brings road congestion, accidents, logistic costs, air pollution and greenhouse gases. Railroads are superior to the other modes of transportation in mass transportability, high speed, timeliness, safety and environmental friendliness, but the railway industry has been pushed behind in competition. Korean railroads were widely used for passenger and freight transport until the middle of the 20th century. However, because of the sudden change in the logistics environment, in which time efficiency became most important, railroads lost logistic competitiveness against transportation by truck. This paper examines the modal shift to railroad transportation, which enjoys high interest as environmentally friendly logistics, i.e., the modal shift strategy. Efficiency analysis is conducted using DEA, and exploratory factors are identified for the modal shift of the companies. This paper also proposes an alternative plan for green, environmentally friendly logistics by analyzing the characteristics of the railroad cargo transportation system and the conditions of local railroad cargo transportation.
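The efficiency analysis mentioned here uses DEA. Below is a minimal input-oriented CCR DEA sketch solved via linear programming; the input/output data are hypothetical, and the paper may use a different DEA variant:

```python
# Sketch: input-oriented CCR DEA efficiency via linear programming.
# Each column is a decision-making unit (DMU); data are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20, 30, 25, 40], [5, 8, 6, 10]], float)  # inputs  x DMUs
Y = np.array([[100, 130, 120, 150]], float)             # outputs x DMUs
n = X.shape[1]

def ccr_efficiency(k):
    """Efficiency of DMU k: min theta s.t. X@lam <= theta*x_k, Y@lam >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam...]
    A_in = np.c_[-X[:, [k]], X]                    # X@lam - theta*x_k <= 0
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]   # -Y@lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return float(res.fun)  # sketch assumes the LP solves; check res.success

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```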

6.
On the problem of the "critical wind velocity" at the air-sea boundary surface
Summary: From measurements of wind profiles at fixed heights above the wave crests, no reliable conclusions can be drawn about the wind's effective shearing force at the sea surface. This is essentially due to the different vertical distributions of wind speed above the waves' crests and troughs, as well as to the fact that when measuring wind profiles above the waves it is mainly the profiles above the wave crests that are obtained. Attempts to "reduce" wind profiles have proved subject to considerable errors; however, they show qualitatively a systematic increase of the frictional factors computed from wind measurements. This links up with the frictional factors ascertained from observations of wind set-up, so that the supposed "leap" from a "smooth" to a "rough" boundary surface, and the "critical wind speed" corresponding to this leap, do not exist at all.
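For background, the frictional (drag) factors discussed here are conventionally derived from the neutral logarithmic wind profile; this is textbook material, not a formula quoted from the paper:

```latex
% Standard neutral-stratification logarithmic wind profile and drag factor
% (textbook background, not a formula from the paper itself):
u(z) = \frac{u_*}{\kappa}\ln\frac{z}{z_0}, \qquad
\tau = \rho\,u_*^2 = \rho\,C_D\,u_{10}^2
\quad\Rightarrow\quad
C_D = \left(\frac{\kappa}{\ln(z_{10}/z_0)}\right)^{2}
```

Here u_* is the friction velocity, κ ≈ 0.4 the von Kármán constant, and z_0 the roughness length; the paper's point is that wave-distorted profiles bias the u_* inferred this way.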

7.
Global Seismic Hazard Assessment Program maps are erroneous
The March 11, 2011 megathrust earthquake on the Pacific coast of the Tohoku Region, Japan, and its consequences once again confirmed evident problems in the conventional methodology of risk and earthquake loss evaluation. A systematic analysis shows that the results of the Global Seismic Hazard Assessment Program (GSHAP, 1992–1999) contradict the actual occurrence of strong earthquakes. In particular, since the publication of the GSHAP final results in 1999, all 60 earthquakes with magnitudes of 7.5 or higher were "surprises" for the GSHAP maps. Moreover, in half of the cases they were "big surprises," when "significant" or even "total" destruction took place instead of the expected "light" or "moderate" destruction. All twelve of the deadliest earthquakes of 2000–2011 (with a total death toll exceeding 700,000 people) prove that the GSHAP results, as well as the underlying methodologies, are deeply flawed and evidently unacceptable for any critical risk assessments intended to prevent disasters caused by earthquakes.

8.
Tikhonov, A. I., Russkikh, A. V., Moralev, G. V., Golitsyn, M. S., Vasil'ev, A. V., Duev, D. S., Timonova, V. A., Nikolaev, A. K., Lemeshko, A. P. Water Resources, 2004, 31(6), 673–678
Uranium-isotopic and multi-element hydrogeochemical methods are used to assess the natural pollution of fresh groundwater in horizons under development, caused by a modern intrusion of hypogene water with increased boron and fluorine contents in zones of old tectonic dislocations. The total index of above-standard (>MAC) pollution of groundwater in the region of the city of Kirov (the Kirov area) reaches 15. To reduce the effect of natural groundwater pollutants on the health of the population, it is recommended that groundwater intake be regulated, purifiers be used, and prospects revealed by the isotopic and hydrogeochemical data be explored. The feasibility of modeling groundwater formation and tracing neotectonic dislocations in geological platforms is shown.
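For orientation, one common way such a total above-MAC index is computed is as the sum of exceedance ratios over the components that exceed their standards; this form is an assumption here, since the abstract does not state its exact formula:

```latex
% One common form of a total above-MAC pollution index (assumed form;
% the abstract does not spell out its definition):
K_{\mathrm{MAC}} = \sum_{i\,:\,C_i > \mathrm{MAC}_i} \frac{C_i}{\mathrm{MAC}_i}
```

On this reading, a value of 15 means the summed exceedance ratios of the offending components (e.g., boron, fluorine) reach fifteen times the standards.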

9.
The ultimate solution to anthropogenic air pollution depends on an adjustment and upgrade of industrial and energy structures. Before this process can be completed, reducing anthropogenic pollutant emissions is an effective measure. This is a problem belonging to "natural cybernetics": the problem of air pollution control should be solved together with the weather prediction; however, this is very complicated. Considering that heavy air pollution usually occurs in stable weather conditions and that the feedbacks between air pollutants and meteorological changes are weak, we propose a simplified natural cybernetics method. Here, an off-line air pollution evolution equation is first solved with data from a given anthropogenic emission inventory under the predicted weather conditions, and then a related "incomplete adjoint problem" is solved to obtain the optimal reduction of anthropogenic emissions. Usually, such a solution is sufficient for satisfying the air quality and economic/social requirements. However, a better solution can be obtained by iteration after updating the emission inventory with the reduced anthropogenic emissions. The paper then discusses the retrieval of the pollutant emission source from a known spatio-temporal distribution of the pollutant concentrations, and a feasible mathematical method to achieve this is proposed. The retrieval of the emission source would also help control air pollution.
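When the transport operator is linear, the source retrieval described here reduces to a linear inverse problem. Below is a minimal regularized least-squares sketch with a random stand-in transport matrix; it illustrates the retrieval idea, not the paper's adjoint machinery:

```python
# Sketch: retrieving an emission source from observed concentrations when
# transport is linear, conc = G @ q. A simple Tikhonov-regularized
# least-squares stand-in; G, q_true and the noise level are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n_obs, n_src = 60, 5
G = rng.random((n_obs, n_src))                 # source-receptor matrix
q_true = np.array([3.0, 0.0, 5.0, 1.0, 0.0])   # true emission rates
conc = G @ q_true + 0.05 * rng.standard_normal(n_obs)  # noisy observations

# min ||G q - conc||^2 + alpha ||q||^2  (normal equations)
alpha = 1e-2
q_hat = np.linalg.solve(G.T @ G + alpha * np.eye(n_src), G.T @ conc)
print("retrieved emissions:", np.round(q_hat, 2))
```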

10.
Variation of snow water resources in northwestern China, 1951–1997
Two models are used to simulate the high-altitude permafrost distribution on the Qinghai-Xizang Plateau: the "altitude model", a Gaussian distribution function describing the latitudinal zonation of permafrost based on the three-dimensional rules of high-altitude permafrost, and the "frost number model", a dimensionless ratio defined from freezing and thawing degree-day sums. The results show that the "altitude model" can accurately simulate the high-altitude permafrost distribution under present climate conditions. Given the essential hypotheses and using the GCM scenarios from HADCM2, the "altitude model" is used to predict the change in permafrost distribution on the Qinghai-Xizang Plateau. The results show that the permafrost on the plateau will not change significantly over the next 20–50 years; the fraction of the total area that disappears will not exceed 19%. However, by the year 2099, if the air temperature increases by an average of 2.91°C on the plateau, the decrease in the area of permafrost will exceed 58%; almost all the permafrost in the southern and eastern plateau will disappear. Project "Fundamental Research of Cryosphere" supported by the Chinese Academy of Sciences.
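The frost number is commonly defined from freezing (DDF) and thawing (DDT) degree-day sums following Nelson and Outcalt (1987); the sketch below assumes the paper uses this standard form:

```python
# Sketch: the surface frost number (Nelson & Outcalt, 1987) from freezing
# and thawing degree-day sums. The paper's exact variant may differ, so
# treat this as an assumed form; the degree-day sums are hypothetical.
import math

def frost_number(ddf, ddt):
    """F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)); F > 0.5 suggests permafrost."""
    return math.sqrt(ddf) / (math.sqrt(ddf) + math.sqrt(ddt))

ddf, ddt = 2800.0, 900.0  # deg C * day, hypothetical plateau grid cell
f = frost_number(ddf, ddt)
print(f"frost number = {f:.2f} ->",
      "permafrost likely" if f > 0.5 else "no permafrost")
```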

11.
Assessing the long-term benefits of marginal improvements in air quality from regulatory intervention is methodologically challenging. In this study, we explore how the relative risks (RRs) of mortality from air pollution exposure change over time and whether patterns in the RRs can be attributed to air quality improvements. We employed two-stage multilevel Cox models to describe the association between air pollution and mortality for 51 cities with data from the American Cancer Society (ACS) cohort (N = 264,299; deaths = 69,819). New pollution data were computed through models that predict yearly average fine particle (PM2.5) concentrations throughout the follow-up (1982–2000). Average PM2.5 concentrations from 1999 to 2000 and sulfate concentrations from 1980 were also examined. We estimated the RRs of mortality associated with air pollution separately for five time periods (1982–1986, 1987–1990, 1991–1994, 1995–1998, and 1999–2000). Mobility models were implemented with a sub-sample of 100,557 subjects to assist with interpreting the RR estimates. Sulfate RRs exhibit a large decline from the 1980s to the 1990s. In contrast, PM2.5 RRs follow the opposite pattern, with larger RRs later in the 1990s. The reduction in sulfate RR may have resulted from air quality improvements that occurred through the 1980s and 1990s in response to the acid rain control program. PM2.5 concentrations also declined in many places, but toxic mobile sources are now the largest contributors to PM in urban areas. This may account for the heightened RR of mortality associated with PM2.5 in the 1990s. The paper concludes with three alternative explanations for the temporal pattern of RRs, each emphasizing the uncertainty in ascribing health benefits to air quality improvements.
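For readers who want the flavor of the analysis, here is a single-city Cox proportional-hazards sketch using the lifelines package; the data are synthetic and the covariates hypothetical, whereas the paper fits two-stage multilevel models across 51 cities:

```python
# Sketch: Cox proportional-hazards fit of mortality against a PM2.5
# covariate, the single-city analogue of the paper's models.
# Synthetic data; not the ACS cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
pm25 = rng.normal(15, 4, n)   # ug/m3, hypothetical long-term exposure
age = rng.uniform(45, 85, n)
hazard = 0.01 * np.exp(0.02 * (pm25 - 15) + 0.05 * (age - 65))
time = rng.exponential(1.0 / hazard)
observed = time < 18.0        # deaths observed within 18-year follow-up
df = pd.DataFrame({"T": np.minimum(time, 18.0), "E": observed,
                   "pm25": pm25, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
# Relative risk (hazard ratio) for a 10 ug/m3 increase in PM2.5:
print("RR per 10 ug/m3:", round(float(np.exp(10 * cph.params_["pm25"])), 2))
```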

12.
A reliable automatic procedure for locating earthquakes in quasi-real time is strongly needed for seismic warning systems, earthquake preparedness, and producing shaking maps. The reliability of an automatic location algorithm is influenced by several factors, such as errors in picking seismic phases, network geometry, and velocity model uncertainties. The main purpose of this work is to investigate the performance of different automatic procedures in order to choose the most suitable one for quasi-real-time earthquake locations in northwestern Italy. The reliability of two automatic picking algorithms (one based on Characteristic Function analysis, the CF picker, and the other based on the Akaike information criterion, the AIC picker) and two location methods (the "Hypoellipse" and "NonLinLoc" codes) is analysed by comparing the automatically determined hypocentral coordinates with reference ones. Reference locations are computed by the "Hypoellipse" code from manually revised data and tested using quarry blasts. The comparison is made on a dataset composed of 575 seismic events for the period 2000–2007, as recorded by the Regional Seismic Network of Northwestern Italy. For P phases, similar results, in terms of both the number of detected picks and the magnitude of travel time differences with respect to manual picks, are obtained with the AIC and CF pickers; on the contrary, for S phases, the AIC picker seems to provide a significantly greater number of readings than the CF picker. Furthermore, the "NonLinLoc" software (applied to a 3D velocity model) proves more reliable than the "Hypoellipse" code (applied to layered 1D velocity models), leading to more reliable automatic locations even when outliers (wrong picks) are present.
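The AIC picker referenced here is typically the Maeda (1985) variant, which places the pick at the minimum of an AIC function computed directly from the trace. A minimal sketch on synthetic data:

```python
# Sketch: the classic AIC onset picker (Maeda, 1985). The minimum of
# AIC(k) marks the most likely phase arrival. Synthetic trace: noise
# followed by a signal starting at sample 400.
import numpy as np

def aic_pick(x):
    """Return k minimizing AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))."""
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(4)
noise = 0.1 * rng.standard_normal(400)
signal = np.sin(np.linspace(0, 60, 200)) + 0.1 * rng.standard_normal(200)
trace = np.concatenate([noise, signal])
print("picked onset sample:", aic_pick(trace), "(true onset: 400)")
```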

13.
Wind-driven wave heights in the German Bight
Wind speed, friction velocity and significant wave height data from the FINO1 platform in the southern German Bight, 45 km off the coast, for the years 2004 to 2006 have been evaluated and related to each other. The data show a clear dependence of the hourly mean wave height on the hourly mean friction velocity and wind speed. Wave heights increase with decreasing stratification and increasing fetch. Synoptic weather patterns for the highest wave heights in the southern German Bight are determined. The analysis is made separately for four wind direction sectors. The two strongest storms in the evaluated period, "Britta" and "Erwin", are analysed in more detail. Finally, the 50-year extreme significant wave height has been estimated to be about 11 m, most probably coming from northerly directions.
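A 50-year return level of this kind is commonly obtained from an extreme-value fit to annual maxima. The sketch below assumes a Gumbel distribution (the paper's exact method is not stated) and uses hypothetical sample values:

```python
# Sketch: Gumbel fit to annual-maximum significant wave heights and the
# implied 50-year return level. Sample values are hypothetical, not the
# FINO1 record.
import numpy as np
from scipy.stats import gumbel_r

annual_max_hs = np.array([6.1, 7.3, 5.8, 8.0, 6.9, 7.7, 6.4, 8.4, 7.1, 6.6])  # m
loc, scale = gumbel_r.fit(annual_max_hs)

T = 50.0  # return period, years
hs_50 = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
print(f"estimated {T:.0f}-year significant wave height: {hs_50:.1f} m")
```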

14.
Attempts to build a "constant-stress-drop" scaling of an earthquake source spectrum have invariably met with difficulties. Physically, such a scaling would mean that the low-frequency content of the spectrum controls the high-frequency content, reducing the number of parameters governing the time history of a shear dislocation to one. This is technically achieved through relationships between the corner frequency of the spectrum and the fault size, inevitably introduced in an arbitrary manner using a constant termed the "stress drop". Throughout decades of observations, this quantity has never proved to be constant. This fact has fundamental physical reasons. The dislocation motion is controlled by two independent parameters: the final static offset and the speed at which it is reached. The former controls the low-frequency asymptote of the spectrum, while the latter controls its high-frequency content. There is no physical reason to believe that the static displacement should predetermine the slip rate, which would be implied if the "stress drop" were constant. Reducing the two parameters to just one (the seismic moment or magnitude) in a "scaling law" has no strict justification; it necessarily involves arbitrary assumptions about the relationship of one parameter to the other. This explains why the "constant-stress-drop" scaling in seismology has been believed in but never reconciled with the data.
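The corner-frequency-to-fault-size link the abstract criticizes is usually built from the Brune (1970) relations, shown here as standard background rather than the paper's own formulas:

```latex
% Brune (1970) source relations (textbook background, not the paper's own):
f_c = \frac{2.34\,\beta}{2\pi r}, \qquad
\Delta\sigma = \frac{7}{16}\,\frac{M_0}{r^{3}}
\quad\Rightarrow\quad
f_c \propto \beta\left(\frac{\Delta\sigma}{M_0}\right)^{1/3}
```

Fixing Δσ therefore makes f_c, and with it the whole high-frequency spectrum, a function of M_0 alone, which is exactly the one-parameter reduction questioned above.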

15.
Study on the pattern and mode of vertical crustal deformation during the seismogenic process of intraplate strong earthquakes. 杨国华, 桂昆长, 巩曰沐, 杨春花, 韩...

16.
Physicochemical and microbiological characteristics of the bathing waters in Varna's Black Sea coastal area were investigated during the year 2007 at 23 monitoring stations. Most of the determined physicochemical parameters (pH, mineral oils, surface-active substances, phenols, dissolved oxygen, nutrients) and microbiological parameters ("total coliforms", "faecal coliforms", "faecal streptococci") were in compliance with the guideline limits and indicated good water quality. Ammonium and phosphate pollution above the limits was found at the South beach, Officers beach and Central beach, situated in Varna's central bathing zone. For the period of 13.08 to 24.10.2007, 70% of the South beach samples analyzed for NH4+ exceeded the limits, by up to 60 times, and the concentrations of PO43− exceeded the limits by up to 17.5 times. Some deviations from the guideline limits regarding microbiology were observed at the same beaches. It is concluded that the area of study is not yet seriously threatened, in spite of the rapid growth of recreation during the last years.

17.
A series of kinematic inversions based on a robust non-linear optimization approach were performed using travel time data from a series of seismic refraction experiments: CELEBRATION 2000, ALP 2002 and SUDETES 2003. These experiments were performed in Central Europe from 2000 to 2003. Data from 8 profiles (CEL09, CEL10, Alp01, S01, S02, S03, S04 and S05) were processed in this study. The goal of this work was to find seismic velocity models yielding travel times consistent with observed data. Optimum 2D inhomogeneous isotropic P-wave velocity models were computed. We have developed and used a specialized two-step inverse procedure. In the first, "parametric" step, the velocity model contains interfaces whose shapes are defined by a number of parameters. The velocity along each interface is assumed constant but may differ between the upper and lower sides of the interface. Linear vertical interpolation is used for points in between interfaces. All parameters are searched for using robust non-linear optimization (the Differential Evolution algorithm). Rays are continuously traced by the bending technique. In the second, "tomographic" step, small-scale velocity perturbations are introduced on a dense grid covering the velocity model obtained so far. Rays are fixed in this step. Final velocity models yield travel time residuals comparable to typical picking errors (RMS ∼ 0.1 s). As a result, depth-velocity cross-sections of P waves along all processed profiles are obtained. The depth range of the models is 35–50 km, and the velocity varies in the range 3.5–8.2 km/s. The lowest velocities are detected in near-surface depth sections crossing sedimentary formations. The middle crust is generally more homogeneous and has a typical P-wave velocity around 6 km/s. Surprisingly, the lower crust is less homogeneous, with computed velocities in the range 6.5–7.5 km/s. The Moho is detected at depths of ≈30–45 km.
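The first, parametric step amounts to a global search over model parameters that minimizes travel-time misfit. A toy one-parameter sketch with SciPy's Differential Evolution follows; the actual models have many interface and velocity parameters:

```python
# Sketch: fitting a velocity by minimizing travel-time RMS with
# Differential Evolution, echoing the paper's "parametric" step.
# The one-parameter direct-wave model and "observed" times are toy data.
import numpy as np
from scipy.optimize import differential_evolution

offsets = np.linspace(5.0, 100.0, 20)  # km
v_true = 6.0                           # km/s, hypothetical crustal velocity
t_obs = offsets / v_true + 0.05 * np.random.default_rng(5).standard_normal(20)

def rms_misfit(params):
    (v,) = params
    t_pred = offsets / v               # direct-wave travel times
    return np.sqrt(np.mean((t_pred - t_obs) ** 2))

result = differential_evolution(rms_misfit, bounds=[(3.0, 9.0)], seed=0)
print(f"best-fit velocity: {result.x[0]:.2f} km/s, RMS = {result.fun:.3f} s")
```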

18.
In this paper, based on the previous study of the practical use of seismic regime windows and seismic regime belts, the problem of establishing a "seismic regime network" consisting of "windows" and "belts" is further posed and discussed, following the observed fact that many "windows" and "belts" respond to one earthquake. For convenience of use, the "seismic regime network" is divided into two classes. The first class can be used in tendency prediction for long-term seismic activity in a large area; the second is used in short-term prediction in a small area. After briefly discussing the physical significance of the "seismic regime network", it is pointed out that this simple and easily used method can be used to observe and extract seismic precursory information from a large area before a great earthquake, and thus can provide a reliable basis for the analysis and judgement of seismic regime tendency in time and space. No doubt, this method is of certain practical significance. The Chinese version of this paper appeared in the Chinese edition of Acta Seismologica Sinica, 13, 161–169, 1991. The English version of this paper was improved by Prof. Shaoxie Xu.

19.
Topsoil (0–20 cm) samples were collected from the edge of roads to locations about 200 m off the roads along four roads with different transportation histories in October 2005. Total concentrations of As, Cd, Cr, Cu, Ni, Pb and Zn were determined using inductively coupled plasma atomic emission spectrometry in order to assess and compare road transportation pollution. Results showed that, with the exception of As, Cu and Pb, the average concentrations of heavy metals were generally higher than the regional elemental background values. Most soil samples were moderately or highly polluted by Cd or Ni, but the contamination index (P_i) values for As, Pb and Zn were lower than those of the other heavy metals at all sites. Among the four roads, heavy metal pollution was heavier for Dali Road due to its longer transportation history, while low or no contamination was observed for the other roads. However, the integrated contamination index (P_c) values showed a generally low contamination or no-contamination level for all soil samples in this region, in the order Dali Road > Dabao Highway > Road 320 > Sixiao Highway. Factor analysis indicated a common pollution source for these heavy metals.
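The indices P_i and P_c are commonly computed as the single-factor ratio to background and the Nemerow integrated index; whether the paper uses exactly the Nemerow form is an assumption here. A sketch with hypothetical concentrations:

```python
# Sketch: single-factor contamination index P_i = C_i / S_i and the
# Nemerow integrated index P_c. Concentrations and background/standard
# values (mg/kg) below are hypothetical, not the study's measurements.
import numpy as np

conc = {"Cd": 0.45, "Ni": 52.0, "Pb": 28.0, "Zn": 80.0}
background = {"Cd": 0.15, "Ni": 30.0, "Pb": 35.0, "Zn": 90.0}

p_i = {m: conc[m] / background[m] for m in conc}   # single-factor indices
p = np.array(list(p_i.values()))
p_c = np.sqrt((p.mean() ** 2 + p.max() ** 2) / 2)  # Nemerow integrated index

print("P_i:", {m: round(v, 2) for m, v in p_i.items()})
print(f"P_c = {p_c:.2f}")
```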

20.
High concentrations of air pollutants in the ambient environment can cause breathing problems in human communities. Effective assessment of health-impact risk from air pollution is important for supporting decisions in the related detection, prevention, and correction efforts. However, the quality of the information available for environmental/health risk assessment is often not good enough for it to be presented as deterministic numbers. Stochastic methods are one way of tackling those uncertainties, whereby uncertain information is presented as probability distributions. However, if the uncertainties cannot be presented as probabilities, they can be handled through fuzzy membership functions. In this study, an integrated fuzzy-stochastic modeling (IFSM) approach is developed for assessing air pollution impacts on asthma susceptibility. The development is based on Monte Carlo simulation of the fate of SO2 in the ambient environment, examination of SO2 concentrations based on the simulation results, quantification of evaluation criteria using fuzzy membership functions, and risk assessment based on the combined fuzzy-stochastic information. The IFSM entails (a) simulation of the fate of pollutants in the ambient environment, with consideration of source/medium uncertainties, (b) formulation of fuzzy air quality management criteria under uncertain human-exposure pathways, exposure dynamics, and SPG-response variations, and (c) integrated risk assessment under the complexities of the combined fuzzy/stochastic inputs of contamination level and health effect (i.e., asthma susceptibility). The developed IFSM is applied to a study of regional air quality management. Reasonable results have been generated, which are useful for evaluating health risks from air pollution. They also provide support for regional environmental management and urban planning.
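A minimal sketch of the two IFSM ingredients: Monte Carlo propagation of source/medium uncertainty into SO2 concentrations, combined with a fuzzy membership grading of the result. The dilution surrogate and membership breakpoints are hypothetical, not the paper's calibrated model:

```python
# Sketch: Monte Carlo + fuzzy membership, the two halves of the IFSM
# idea. Emission, dilution, and risk breakpoints are hypothetical.
import numpy as np

rng = np.random.default_rng(6)
n_runs = 10_000

# Stochastic part: uncertain emission rate and dilution produce a
# distribution of ambient SO2 concentrations (ug/m3).
emission = rng.lognormal(mean=3.0, sigma=0.3, size=n_runs)
dilution = rng.uniform(0.5, 2.0, size=n_runs)
so2 = emission / dilution

def high_risk_membership(c, lo=40.0, hi=80.0):
    """Fuzzy 'high asthma risk' grade: 0 below lo, 1 above hi, linear between."""
    return np.clip((c - lo) / (hi - lo), 0.0, 1.0)

mu = high_risk_membership(so2)
print(f"mean membership in 'high risk': {mu.mean():.2f}")
print(f"P(membership > 0.5): {(mu > 0.5).mean():.2f}")
```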
