Similar documents
20 similar documents found (search time: 125 ms)
1.
Chlororespiration is common in shallow aquifer systems under conditions nominally identified as anoxic. Consequently, chlororespiration is a key component of remediation at many chloroethene-contaminated sites. In some instances, limited accumulation of reductive dechlorination daughter products is interpreted as evidence that natural attenuation is not adequate for site remediation. This conclusion is justified when evidence for parent compound (tetrachloroethene, PCE, or trichloroethene, TCE) degradation is lacking. For many chloroethene-contaminated shallow aquifer systems, however, nonconservative losses of the parent compounds are clear but the mass balance between parent compound attenuation and accumulation of reductive dechlorination daughter products is incomplete. Incomplete mass balance indicates a failure to account for important contaminant attenuation mechanisms and is consistent with contaminant degradation to nondiagnostic mineralization products like CO2. While anoxic mineralization of chloroethene compounds has been proposed previously, recent results suggest that oxygen-based mineralization of chloroethenes also can be significant at dissolved oxygen concentrations below the currently accepted field standard for nominally anoxic conditions. Thus, reassessment of the role and potential importance of low concentrations of oxygen in chloroethene biodegradation is needed, because mischaracterization of operant biodegradation processes can lead to expensive and ineffective remedial actions. A modified interpretive framework is provided for assessing the potential for chloroethene biodegradation under different redox conditions and the probable role of oxygen in chloroethene biodegradation.

2.
Like tree rings, high-resolution soil sampling of low-permeability (low-k) zones can be used to evaluate the style of source history at contaminated sites (i.e., the historical pattern of concentration and composition vs. time since releases occurred at the interface with the low-k zone). This is valuable for the development of a conceptual site model (CSM) and can serve as an important line of evidence supporting monitored natural attenuation (MNA) as a long-term remedy. Source histories were successfully reconstructed at two sites at Naval Air Station Jacksonville using a simple one-dimensional (1D) model. The plume arrival time and historical composition were reconstructed from the time of the initial releases, which were suspected to have occurred decades earlier. At the first site (Building 106), the source reconstructions showed relatively constant source concentrations but significant attenuation over time in the downgradient plume in the transmissive zone, suggesting that MNA may not be an appropriate remedy if source control is a requirement, although attenuation processes are clearly helping to maintain plume stability and reduce risk. At the second site (Building 780), source concentrations in the transmissive zone showed a decline of approximately one order of magnitude over time, but apparently less attenuation in the downgradient plume. The source reconstruction method appeared to reflect site remediation efforts (excavation, soil vapor extraction) implemented in the 1990s. Finally, a detailed analysis using molecular biological tools, carbon isotopes, and by-products suggests that most degradation activity is associated with high-k zones rather than low-k zones at these source areas. Overall, the source reconstruction methodology provided insight into historical concentration trends not obtainable otherwise, given the limited long-term monitoring data.
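The 1D source-reconstruction approach summarized above rests on diffusion of dissolved contaminant from the transmissive zone into the low-k zone. The following minimal sketch (not the authors' code; the function name, parameter values, and piecewise-constant source history are hypothetical) illustrates the forward half of that workflow: an assumed interface concentration history is propagated into the low-k zone by superposing analytical step responses, and the resulting profile can then be compared with high-resolution soil data.

```python
# Hedged sketch: 1D diffusion of a piecewise-constant interface concentration
# history into a low-k zone (superposition of erfc step responses).
import numpy as np
from scipy.special import erfc

def low_k_profile(z, t_obs, step_times, step_concs, Da):
    """Concentration vs. depth z (m) into the low-k zone at observation time
    t_obs (s), for interface concentration steps step_concs (mg/L) beginning at
    step_times (s). Da is an apparent diffusivity D_e/R (m^2/s)."""
    z = np.asarray(z, dtype=float)
    c = np.zeros_like(z)
    prev = 0.0
    for t0, c0 in zip(step_times, step_concs):
        if t_obs <= t0:
            break
        dc = c0 - prev                      # magnitude of this concentration step
        c += dc * erfc(z / (2.0 * np.sqrt(Da * (t_obs - t0))))
        prev = c0
    return c

# Hypothetical example: a 100 mg/L release beginning in 1970, reduced to 10 mg/L
# in 1995, observed in 2010.
yr = 365.25 * 86400.0
profile = low_k_profile(np.linspace(0.0, 1.0, 50), 40 * yr,
                        [0.0, 25 * yr], [100.0, 10.0], Da=1e-10)
```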

3.
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profiling and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
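For readers unfamiliar with the petrophysical step mentioned above, a common starting point (assumed here only for illustration; the study's actual calibrated relation may differ) is an Archie-type law linking ERT-derived bulk conductivity to fluid conductivity, which can then be regressed against sampled amendment indicators such as specific conductance.

```python
# Hedged illustration, assuming an Archie-type relation with no surface conduction.
def fluid_conductivity(sigma_bulk, porosity, a=1.0, m=1.3):
    """Archie's law: sigma_bulk = (porosity**m / a) * sigma_fluid, solved for sigma_fluid."""
    formation_factor = a * porosity ** (-m)
    return sigma_bulk * formation_factor

# Hypothetical example: bulk conductivity of 0.02 S/m in a 30%-porosity sand
sigma_w = fluid_conductivity(0.02, 0.30)   # ~0.096 S/m
```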

4.
Pump-and-treat systems can prevent the migration of groundwater contaminants, and candidate systems are typically evaluated with groundwater models. Such models should be rigorously assessed to determine predictive capabilities, and numerous tools and techniques for model assessment are available. While various assessment methodologies (e.g., model calibration, uncertainty analysis, and Bayesian inference) are well-established for groundwater modeling, this paper calls attention to an alternative assessment technique known as screening-level sensitivity analysis (SLSA). SLSA can quickly quantify first-order (i.e., main effects) measures of parameter influence in connection with various model outputs. Subsequent comparisons of parameter influence with respect to calibration vs. prediction outputs can suggest gaps in model structure and/or data. Thus, while SLSA has received little attention in the context of groundwater modeling and remedial system design, it can nonetheless serve as a useful and computationally efficient tool for preliminary model assessment. To illustrate the use of SLSA in the context of designing groundwater remediation systems, four SLSA techniques were applied to a hypothetical, yet realistic, pump-and-treat case study to determine the relative influence of six hydraulic conductivity parameters. The methods considered were: Taguchi design-of-experiments (TDOE); Monte Carlo statistical independence (MCSI) tests; average composite scaled sensitivities (ACSS); and elementary effects sensitivity analysis (EESA). In terms of performance, the various methods identified the same parameters as being the most influential for a given simulation output. Furthermore, results indicate that the background hydraulic conductivity (KBK) is important for predicting system performance, but calibration outputs are insensitive to this parameter. The observed insensitivity is attributed to a nonphysical specified-head boundary condition used in the model formulation, which effectively "staples" head values located within the conductivity zone. Thus, potential strategies for improving model predictive capabilities include additional data collection targeting the KBK parameter and/or revision of the model structure to reduce the influence of the specified-head boundary.
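To make the screening idea concrete, the sketch below implements one of the techniques named in the abstract, elementary effects (Morris-style) screening, against a toy response function. This is not the paper's code: in practice f(x) would wrap a groundwater-model run returning a calibration or prediction output, and the toy function, parameter count, and ranges are hypothetical.

```python
# Hedged sketch of elementary-effects (one-at-a-time) screening.
import numpy as np

def elementary_effects(f, lower, upper, r=20, delta=0.1, seed=0):
    """Elementary effects for k parameters scaled to [0, 1]. Returns mu_star
    (mean |EE|, overall influence) and sigma (EE spread, interaction/nonlinearity)."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    k = len(lower)
    ee = np.empty((r, k))
    for j in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)          # unit-scaled base point
        f0 = f(lower + x * (upper - lower))
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                                  # perturb parameter i only
            ee[j, i] = (f(lower + xp * (upper - lower)) - f0) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Toy stand-in for a model output driven by six hydraulic conductivity parameters
toy = lambda K: 2.0 * K[0] + 0.5 * K[1] ** 2 + 0.1 * K[2] + 1e-3 * (K[3] + K[4] + K[5])
mu_star, sigma = elementary_effects(toy, lower=[1.0] * 6, upper=[10.0] * 6)
```

Parameters with large mu_star but near-zero sensitivity in calibration outputs would be flagged exactly as the abstract describes for KBK.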

5.
A crude-oil spill occurred in 1979 when a pipeline burst near Bemidji, MN. In 1998, the pipeline company installed a dual-pump recovery system designed to remove crude oil remaining in the subsurface at the site. The remediation from 1999 to 2003 resulted in removal of about 115,000 L of crude oil, representing between 36% and 41% of the volume of oil (280,000 to 316,000 L) estimated to be present in 1998. Effects of the 1999 to 2003 remediation on the dissolved plume were evaluated using measurements of oil thicknesses in wells plus measurements of dissolved oxygen in groundwater. Although the recovery system decreased oil thicknesses in the immediate vicinity of the remediation wells, average oil thicknesses measured in wells were largely unaffected. Dissolved-oxygen measurements indicate that a secondary plume was caused by disposal of the pumped water in an upgradient infiltration gallery; this plume expanded rapidly immediately following the start of the remediation in 1999. The result was expansion of the anoxic zone of groundwater upgradient and beneath the existing natural attenuation plume. Oil-phase recovery at this site was shown to be challenging, and considerable volumes of mobile and entrapped oil remain in the subsurface despite remediation efforts.
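As a quick consistency check of the recovery fraction quoted above (115,000 L removed against an estimated 280,000 to 316,000 L in place in 1998):

```python
# 115,000 L removed / 280,000-316,000 L estimated in place -> 36% to 41%
removed = 115_000
low, high = 280_000, 316_000
print(f"{removed / high:.0%} to {removed / low:.0%}")   # -> 36% to 41%
```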

6.
ZVI-Clay is an emerging remediation approach that combines zero-valent iron (ZVI)-mediated degradation and in situ stabilization of chlorinated solvents. Through use of in situ soil mixing to deliver reagents, reagent-contaminant contact issues associated with natural subsurface heterogeneity are overcome. This article describes implementation, treatment performance, and reaction kinetics during the first year after application of the ZVI-Clay remediation approach at Marine Corps Base Camp Lejeune, North Carolina. Primary contaminants included trichloroethylene, 1,1,2,2-tetrachloroethane, and related natural degradation products. For the field application, 22,900 m³ of soils were treated to an average depth of 7.6 m with 2% ZVI and 3% sodium bentonite (dry weight basis). Performance monitoring included analysis of soil and water samples. After 1 year, total concentrations of chlorinated volatile organic compounds (CVOCs) in soil samples were decreased by site-wide average and median values of 97% and >99%, respectively. Total CVOC concentrations in groundwater were reduced by average and median values of 81% and >99%, respectively. In several of the soil and groundwater monitoring locations, reductions in total CVOC concentrations of greater than 99.9% were apparent. Further reduction in concentrations of chlorinated solvents is expected with time. Pre- and post-mixing average hydraulic conductivity values were 1.7 × 10⁻⁵ and 5.2 × 10⁻⁸ m/s, respectively, indicating a reduction of about 2.5 orders of magnitude. By achieving simultaneous contaminant mass depletion and hydraulic conductivity reduction, contaminant flux reductions of several orders of magnitude are predicted.
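The "about 2.5 orders of magnitude" figure follows directly from the reported pre- and post-mixing hydraulic conductivities:

```python
# log10(1.7e-5 / 5.2e-8) ~= 2.51 orders of magnitude reduction
import math
orders = math.log10(1.7e-5 / 5.2e-8)
print(f"{orders:.2f} orders of magnitude")   # -> 2.51 orders of magnitude
```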

7.
Pre- and post-remediation data sets are used herein to assess the effectiveness of remedial measures implemented in the headwaters of the Mineral Creek watershed, where contamination from hard rock mining has led to elevated metal concentrations and acidic pH. Collection of pre- and post-remediation data sets generally followed the synoptic mass balance approach, in which numerous stream and inflow locations are sampled for the constituents of interest and estimates of streamflow are determined by tracer dilution. The comparison of pre- and post-remediation data sets is confounded by hydrologic effects and the effects of temporal variation. Hydrologic effects arise due to the relatively wet conditions that preceded the collection of pre-remediation data, and the relatively dry conditions associated with the post-remediation data set. This difference leads to a dilution effect in the upper part of the study reach, where pre-remediation concentrations were diluted by rainfall, and a source area effect in the lower part of the study reach, where a smaller portion of the watershed may have been contributing constituent mass during the drier post-remediation period. A second confounding factor, temporal variability, violates the steady-state assumption that underlies the synoptic mass balance approach, leading to false identification of constituent sources and sinks. Despite these complications, remedial actions completed in the Mineral Creek headwaters appear to have led to improvements in stream water quality, as post-remediation profiles of instream load are consistently lower than the pre-remediation profiles over the entire study reach for six of the eight constituents considered (aluminium, arsenic, cadmium, copper, iron, and zinc). Concentrations of aluminium, cadmium, copper, lead, and zinc remain above chronic aquatic-life standards, however, and additional remedial actions may be needed. Future implementations of the synoptic mass balance approach should be preceded by an assessment of temporal variability, and modifications to the synoptic sampling protocol should be made if necessary. Published in 2009 by John Wiley & Sons, Ltd.
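The bookkeeping behind the synoptic mass balance approach is simple: instream load is the product of tracer-dilution streamflow and sampled concentration, and changes in load between successive synoptic sites point to source or sink reaches. The sketch below is illustrative only; the site values are hypothetical, not Mineral Creek data.

```python
# Hedged sketch of synoptic mass-balance load profiling (hypothetical numbers).
import numpy as np

flow_L_per_s = np.array([20.0, 24.0, 31.0, 33.0])    # tracer-dilution streamflow at synoptic sites
zinc_mg_per_L = np.array([0.9, 1.4, 1.1, 1.0])        # sampled Zn concentration
load_mg_per_s = flow_L_per_s * zinc_mg_per_L           # instream Zn load profile
delta_load = np.diff(load_mg_per_s)                    # + = apparent source reach, - = apparent sink
print(load_mg_per_s, delta_load)
```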

8.
Sustainability and resilience are issues that are recognized worldwide, and increased attention should be placed on strategies to design and maintain infrastructure systems that are hazard resilient, damage tolerant, and sustainable. In this paper, a methodology to evaluate the seismic sustainability and resilience of both conventional and base-isolated steel buildings is presented. Furthermore, the proposed approach is used to explore the difference between the performance associated with these buildings by considering the three pillars of sustainability: economic, social, and environmental. Sustainability and resilience are both considered to provide a comprehensive performance-based assessment. The uncertainties associated with performance and consequence evaluation of structural and non-structural components are incorporated within the assessment process. The proposed performance-based assessment approach is illustrated on conventional and base-isolated steel buildings under given seismic scenarios. Copyright © 2015 John Wiley & Sons, Ltd.

9.
Marine seismic data are always affected by noise. An effective method to handle a broad range of noise problems is a time-frequency de-noising algorithm. In this paper we explain details regarding the implementation of such a method. Special emphasis is given to the choice of threshold values, where several different strategies are investigated. In addition we present a number of processing results where time-frequency de-noising has been successfully applied to attenuate noise resulting from swell, cavitation, strumming and seismic interference. Our seismic interference noise removal approach applies time-frequency de-noising on slowness gathers (τ-p domain). This processing trick represents a novel approach, which efficiently handles certain types of seismic interference noise that otherwise are difficult to attenuate. We show that time-frequency de-noising is an effective, amplitude-preserving and robust tool that gives superior results compared to many other conventional de-noising algorithms (for example frequency filtering, τ-p, or f-x prediction). As a background, some of the physical mechanisms responsible for the different types of noise are also explained. Such physical understanding is important because it can provide guidelines for future survey planning and for the actual processing.
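The general idea behind this class of de-noising can be sketched as amplitude thresholding in the time-frequency domain: coefficients whose magnitude greatly exceeds a reference level are clipped back while their phase is preserved. The snippet below is not the authors' algorithm; the threshold rule (k times the per-frequency median, with k = 3) is an arbitrary placeholder, and the choice of threshold is precisely the design question the paper examines.

```python
# Hedged sketch of time-frequency amplitude thresholding for high-amplitude noise.
import numpy as np
from scipy.signal import stft, istft

def tf_denoise(trace, fs, k=3.0, nperseg=128):
    f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
    mag = np.abs(Z)
    ref = np.median(mag, axis=1, keepdims=True)       # per-frequency reference level
    clipped = np.minimum(mag, k * ref)                 # clip anomalously strong coefficients
    Z_clean = clipped * np.exp(1j * np.angle(Z))       # keep the original phase
    _, rec = istft(Z_clean, fs=fs, nperseg=nperseg)
    return rec[:len(trace)]

# Synthetic example: a 40 Hz signal contaminated by strong low-frequency "swell" noise
fs = 500.0
tax = np.arange(0.0, 4.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 40 * tax) + 8.0 * np.sin(2 * np.pi * 2 * tax)
clean = tf_denoise(trace, fs)
```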

10.
Although all of the main properties of a ground motion cannot be captured through a single parameter, a number of different engineering parameters have been proposed that are able to reflect one or more ground-motion characteristics concurrently. For many of these parameters, especially regarding Greece, there are relatively few or no predictive models. In this context, we present a set of new regionally calibrated equations for the prediction of the geometric mean of the horizontal components of 10 amplitude-, frequency response-, and duration-based parameters for shallow crustal earthquakes. These equations supersede previous empirical relationships for Greece: their applicability range for magnitude and epicentral distance has been extended down to Mw 4 and up to 200 km, respectively; the incorporation of a term accounting for anelastic attenuation has been investigated; and their development was based on a ground-motion dataset spanning 1973 to 2014. For all ground-motion parameters, we provide alternative optimal equations relative to the availability of information on the different explanatory variables. In all velocity-based parameters, and contrary to the acceleration-based parameters, the anelastic attenuation coefficient was found statistically insignificant when it was combined with the geometric decay and the coefficient accounting for saturation with distance. In the regressions where the geometric decay coefficient simultaneously incorporated the contribution of anelastic attenuation, its increase was found to be much less considerable in the velocity-based than in the acceleration-based parameters, implying a stronger effect of anelastic attenuation on the parameters that are defined via the acceleration time history.
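For orientation, the terms discussed above (geometric decay, near-source saturation, anelastic attenuation) enter a generic ground-motion prediction equation roughly as sketched below. This is not the paper's model; the functional form is a common template and all coefficients are hypothetical.

```python
# Hedged illustration of a generic GMPE functional form (not the paper's equation).
import numpy as np

def ln_gm(Mw, R_epi, c1, c2, c3, c4, c5, h, site_term=0.0):
    """ln(ground-motion parameter): c1 + c2*Mw + c3*Mw**2 (magnitude scaling),
    c4*ln(sqrt(R**2 + h**2)) (geometric decay with saturation distance h),
    and c5*R (the anelastic-attenuation term discussed above)."""
    R = np.sqrt(R_epi ** 2 + h ** 2)
    return c1 + c2 * Mw + c3 * Mw ** 2 + c4 * np.log(R) + c5 * R_epi + site_term

# Example call with made-up coefficients
y = np.exp(ln_gm(5.5, 60.0, c1=-1.0, c2=0.9, c3=-0.05, c4=-1.2, c5=-0.004, h=6.0))
```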

11.
12.
The fate of hydrocarbons in the subsurface near Bemidji, Minnesota, has been investigated by a multidisciplinary group of scientists for over a quarter century. Research at Bemidji has involved extensive investigations of multiphase flow and transport, volatilization, dissolution, geochemical interactions, microbial populations, and biodegradation with the goal of providing an improved understanding of the natural processes limiting the extent of hydrocarbon contamination. A considerable volume of oil remains in the subsurface today despite 30 years of natural attenuation and 5 years of pump-and-skim remediation. Studies at Bemidji were among the first to document the importance of anaerobic biodegradation processes for hydrocarbon removal and remediation by natural attenuation. Spatial variability of hydraulic properties was observed to influence subsurface oil and water flow, vapor diffusion, and the progression of biodegradation. Pore-scale capillary pressure-saturation hysteresis and the presence of fine-grained sediments impeded oil flow, causing entrapment and relatively large residual oil saturations. Hydrocarbon attenuation and plume extent were a function of groundwater flow, compound-specific volatilization, dissolution and biodegradation rates, and availability of electron acceptors. Simulation of hydrocarbon fate and transport affirmed concepts developed from field observations and provided estimates of field-scale reaction rates and hydrocarbon mass balance. Long-term field studies at Bemidji have illustrated that the fate of hydrocarbons evolves with time, and a snapshot study of a hydrocarbon plume may not provide information that is of relevance to the long-term behavior of the plume during natural attenuation.

13.
This paper presents the results of a probabilistic evaluation of the seismic performance of 3D steel moment-frame structures. Two types of framing system are considered: one-way frames typical of construction in the United States and two-way frames typical of construction in Japan. For each framing system, four types of beam–column connections are considered: pre-Northridge welded-flange bolted-web, post-Northridge welded-flange welded-web, reduced-beam-section, and bolted-flange-plate connections. A suite of earthquake ground motions is used to compute the annual probability of exceedance (APE) for a series of drift demand levels and for member plastic-rotation capacity. Results are compared for the different framing systems and connection details. It is found that the two-way frames, which have a larger initial stiffness and strength than the one-way frames for the same beam and column volumes, have a smaller APE for small drift demands for which members exhibit no or minimal yielding, but have a larger APE for large drift demands for which members exhibit large plastic rotations. However, the one-way frames, which typically comprise a few seismic frames with large-sized members that have relatively small rotation capacities, may have a larger APE for member failure. The probabilistic approach presented in this study may be used to determine the most appropriate frame configuration to meet an owner's performance objectives. Copyright © 2008 John Wiley & Sons, Ltd.

14.
Diminishing rates of subsurface volatile contaminant removal by soil vapor extraction (SVE) often warrant an in-depth performance assessment to guide remedy decision-making processes. Such a performance assessment must include quantitative approaches to better understand the impact of remaining vadose zone contamination on soil gas and groundwater concentrations. The spreadsheet-based Soil Vapor Extraction Endstate Tool (SVEET) software functionality has recently been expanded to facilitate quantitative performance assessments. The updated version, referred to as SVEET2, includes expansion of the input parameter ranges for describing a site (site geometry, source characteristics, etc.), an expanded list of contaminants, and incorporation of elements of the Vapor Intrusion Estimation Tool for Unsaturated-zone Sources software to provide soil gas concentration estimates for use in vapor intrusion evaluation. As part of the update, SVEET2 was used to estimate the impact of a tetrachloroethene (PCE) vadose zone source on groundwater concentrations, comparing SVEET2 results to field-observed values at an undisclosed site where SVE was recently terminated. PCE concentrations from three separate monitoring wells were estimated by SVEET2 to be within the range of 6.0–6.7 μg/L, as compared to actual field concentrations that ranged from 3 to 11 μg/L PCE. These data demonstrate that SVEET2 can rapidly provide representative quantitative estimates of impacts from a vadose zone contaminant source at field sites. In the context of the SVE performance assessment, such quantitative estimates provide a basis to support remedial and/or regulatory decisions regarding the continued need for vadose zone volatile organic compound remediation or technical justification for SVE termination, which can significantly reduce the cost to complete for a site.

15.
The American Society of Civil Engineers (ASCE) 43-05 presents two performance objectives for the design of nuclear structures, systems and components in nuclear facilities: (1) 1% probability of unacceptable performance for 100% design basis earthquake (DBE) shaking and (2) 10% probability of unacceptable performance for 150% DBE shaking. To aid in the revision of the ASCE 4-98 procedures for the analysis and design of base-isolated nuclear power plants and meet the intent of ASCE 43-05, a series of nonlinear response-history analyses was performed to study the impact of the variability in both earthquake ground motion and mechanical properties of isolation systems on the seismic responses of base-isolated nuclear power plants. Computations were performed for three representative sites (rock and soil sites in the Central and Eastern United States and a rock site in the Western United States) and three types of isolators (lead rubber, Friction Pendulum and low-damping rubber bearings) using realistic mechanical properties for the isolators. Estimates were made of (1) the ratio of the 99th percentile (90th percentile) response of isolation systems computed using a distribution of spectral demands and distributions of isolator mechanical properties to the median response of isolation systems computed using best-estimate properties and 100% (150%) spectrum-compatible DBE ground motions; (2) the number of sets of three-component ground motions to be used for response-history analysis to develop a reliable estimate of the median response of isolation systems. The results of this study provide the technical basis for the revision of ASCE Standard 4-98. Copyright © 2012 John Wiley & Sons, Ltd.

16.
Substructure hybrid simulation has been the subject of numerous investigations in recent years. The simulation method allows for the assessment of the seismic performance of structures by representing critical components with physical specimens and the rest of the structure with numerical models. In this study, the system-level performance of a six-storey structure with telescoping self-centering energy dissipative (T-SCED) braces is validated through pseudo-dynamic (PsD) hybrid simulation. Fragility curves are derived for the T-SCED system. This paper presents the configuration of the hybrid simulation, the newly developed control software for PsD hybrid simulation, which can integrate generic hydraulic actuators into PsD hybrid simulation, and the seismic performance of a structure equipped with T-SCED braces. The experimental results show that the six-storey structure with T-SCED braces satisfies performance limits specified in ASCE 41. Copyright © 2013 John Wiley & Sons, Ltd.

17.
Experiences at five pump-and-treatment (P&T) facilities provide three important lessons:
1. To reduce future costs, it is important to use the best hydraulic information possible for designing P&T systems, and to incorporate flexibility to compensate for uncertainties in hydraulic conditions. A phased approach to P&T system construction and hydraulic testing has been successful.
2. In practice, downtimes and maintenance problems result in significant reductions in the average ground water extraction rates. This indicates that operation and maintenance were more difficult than expected and warrant more attention. Furthermore, P&T systems are generally designed to attain the model-optimized extraction rates only when the system is in full operation. Designers must recognize that P&T systems have downtimes and, therefore, that the pumping rate during uptime must be sufficient to maintain capture. Generally, P&T systems should achieve model-optimized extraction rates on an average basis rather than only when the system is in full operation (see the worked example after this list).
3. During operation of P&T systems, the average extraction rates should be routinely correlated to capture zone evaluations and compared to the model-derived extraction rates to assess whether the design objectives are being met. These rates should be included in routine monitoring reports to confirm maintenance of pumping rates under which capture has been demonstrated.
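A hedged worked example of the uptime argument in lesson 2 above (all numbers are hypothetical, not from the five facilities described): if capture requires a model-optimized average extraction rate and the system is down part of the time, the pumping rate while operating must be scaled up accordingly.

```python
# Hypothetical uptime adjustment for a P&T design rate.
model_optimized_rate = 400.0   # L/min required on average to maintain capture
uptime_fraction = 0.85         # fraction of time the system actually operates
required_uptime_rate = model_optimized_rate / uptime_fraction
print(f"design pumping rate while operating: {required_uptime_rate:.0f} L/min")  # ~471 L/min
```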

18.
Centralized semi-active control is a technique for controlling the whole structure using one main computer. Centralized control systems provide better control for relatively short to medium-height structures, where the response of any story cannot be separated from that of the adjacent ones. In this paper, two centralized control approaches are proposed for controlling the seismic response of post-tensioned (PT) steel frames. The first approach, the stiffness control approach, aims to alter the stiffness of the PT frame so that it avoids large dynamic amplifications due to earthquake excitations. The second approach, the deformation regulation control approach, aims at redistributing the demand/strength ratio in order to provide a more uniform distribution of deformations over the height of the structure. The two control approaches were assessed through simulations of the earthquake response of semi-actively and passively controlled six-story post-tensioned steel frames. The results showed that the stiffness control approach is efficient in reducing the frame deformations and internal forces. The deformation regulation control approach was found to be efficient in reducing the frame displacements and generating a more uniform distribution of the inter-story drifts. These results indicate that centralized semi-active control can be used to improve the seismic performance of post-tensioned steel frames. Copyright © 2014 John Wiley & Sons, Ltd.

19.
Many studies indicate that small-scale heterogeneity and/or mobile–immobile mass exchange produce transient non-Fickian plume behavior that is not well captured by the use of the standard, deterministic advection-dispersion equation (ADE). An extended ADE modeling framework is presented here that is based on continuous time random walk theory. It can be used to characterize non-Fickian transport coupled with simultaneous sequential first-order reactions (e.g., biodegradation or radioactive decay) for multiple degrading contaminants such as chlorinated solvents, royal demolition explosive (RDX), pesticides, and radionuclides. To demonstrate this modeling framework, new transient analytical solutions are derived and are inverted in Laplace space. Closed-form, steady-state, multi-species analytical solutions are also derived for non-Fickian transport in highly heterogeneous aquifers with linear sorption–desorption and matrix diffusion for use in spreadsheets. The solutions are general enough to allow different degradation rates for the mobile and immobile zones. The transient solutions for multi-species transport are applied to examine the effects of source remediation on the natural attenuation of downgradient plumes of both parent and degradation products in highly heterogeneous aquifers. Results for representative settings show that the use of the standard, deterministic ADE can overestimate cleanup rates and underpredict the cleanup timeframe in comparison to the extended ADE analytical model. The modeling framework and calculations introduced here are also applied for a 30-year groundwater cleanup program at a site in Palm Bay, Florida. The simulated plume concentrations using the extended ADE exhibited agreement with observed long concentration tails of trichloroethene, cis-1,2-DCE, and VC that remained above cleanup goals.
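The sequential first-order reaction network that these multi-species solutions couple to non-Fickian transport can be illustrated in isolation with a simple batch (transport-free) calculation. The sketch below is not the paper's analytical solution; it only shows the parent/daughter bookkeeping for a chlorinated-ethene chain, with hypothetical rate constants and initial concentrations.

```python
# Hedged sketch: batch sequential first-order decay chain TCE -> cis-1,2-DCE -> VC -> ethene.
import numpy as np
from scipy.integrate import solve_ivp

k = {"TCE": 0.8, "cDCE": 0.3, "VC": 0.1}           # first-order rate constants, 1/yr (hypothetical)

def chain(t, c):
    tce, dce, vc, eth = c
    return [-k["TCE"] * tce,
            k["TCE"] * tce - k["cDCE"] * dce,       # cis-1,2-DCE produced from TCE
            k["cDCE"] * dce - k["VC"] * vc,         # VC produced from cis-1,2-DCE
            k["VC"] * vc]                           # ethene accumulates

sol = solve_ivp(chain, (0.0, 30.0), [100.0, 0.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 30.0, 121))
tce, dce, vc, eth = sol.y                           # concentration histories, mg/L
```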

20.
Three-dimensional seismic survey design should provide an acquisition geometry that enables imaging and amplitude-versus-offset applications of target reflectors with sufficient data quality under given economic and operational constraints. However, in land or shallow-water environments, surface waves are often dominant in the seismic data. The effectiveness of surface-wave separation or attenuation significantly affects the quality of the final result. Therefore, the need for surface-wave attenuation imposes additional constraints on the acquisition geometry. Recently, we have proposed a method for surface-wave attenuation that can better deal with aliased seismic data than classic methods such as slowness/velocity-based filtering. Here, we investigate how surface-wave attenuation affects the selection of survey parameters and the resulting data quality. To quantify the latter, we introduce a measure that represents the estimated signal-to-noise ratio between the desired subsurface signal and the surface waves that are deemed to be noise. In a case study, we applied surface-wave attenuation and signal-to-noise ratio estimation to several data sets with different survey parameters. The spatial sampling intervals of the basic subset are the survey parameters that affect the performance of surface-wave attenuation methods the most. Finer spatial sampling will reduce aliasing and make surface-wave attenuation easier, resulting in better data quality until no further improvement is obtained. We observed this behaviour as a main trend that levels off at increasingly denser sampling. With our method, this trend curve lies at a considerably higher signal-to-noise ratio than with a classic filtering method. This means that we can obtain a much better data quality for a given survey effort, or the same data quality as with a conventional method at a lower cost.
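A simple version of the kind of signal-to-noise measure described above (the paper's exact definition may differ) is the ratio of average power in a window judged to contain reflected signal to average power in a window dominated by surface waves, expressed in decibels.

```python
# Hedged sketch of a windowed signal-to-noise estimate in dB.
import numpy as np

def snr_db(signal_window, noise_window):
    p_sig = np.mean(np.asarray(signal_window, float) ** 2)
    p_noise = np.mean(np.asarray(noise_window, float) ** 2)
    return 10.0 * np.log10(p_sig / p_noise)

# Synthetic example: signal amplitudes twice the noise amplitudes -> about 6 dB
rng = np.random.default_rng(1)
print(f"{snr_db(2.0 * rng.standard_normal(1000), rng.standard_normal(1000)):.1f} dB")
```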
