In this age of modern biology, aquatic toxicological research has pursued mechanisms of action of toxicants, providing potential tools for ecotoxicologic investigations. However, problems of biocomplexity and issues at higher levels of biological organization remain a challenge. In the 1980s and 1990s, and continuing to a lesser extent today, organisms residing in highly contaminated field sites or exposed in the laboratory to calibrated concentrations of individual compounds were carefully analyzed for their responses to priority pollutants. Correlating biochemical and structural analyses in cultured cells and tissues with in vivo exposures led to the production and application of biomarkers of exposure and effect, and to our awareness of genotoxicity and its chronic manifestations, such as neoplasms, in wild fishes. To gain acceptance of these findings in the greater environmental toxicology community, “validation of the model” against other, better-established (often rodent) models was necessary and became a major focus. Resultant biomarkers were applied to heavily contaminated and reference field sites as part of effects assessment and to investigations following large-scale disasters such as oil spills or industrial accidents.
Over the past 15 years, small aquarium fish models such as medaka (Oryzias latipes), zebrafish (Danio rerio), platyfish (Xiphophorus species), fathead minnow (Pimephales promelas), and sheepshead minnow (Cyprinodon variegatus) have been increasingly used in the laboratory to establish mechanisms of toxicant action. Today, the same organisms provide reliable information at higher levels of biological organization relevant to ecotoxicology. We review studies resolving mechanisms of toxicity and discuss ways to address biocomplexity, mixtures of contaminants, and the need to relate individual-level responses to populations and communities.
Based on classical iterative computation results, new equations for calculating the surface turbulent transfer coefficients are proposed that allow for large ratios of the momentum and heat roughness lengths. Compared with the Launiainen scheme, the proposed scheme produces results closer to the classical iterative computations. Under unstable stratification, the relative error of the Launiainen scheme increases linearly with increasing instability and can exceed 15%, while the relative error of the present scheme is always less than 8.5%. Under stable stratification, the Launiainen scheme uses two equations, one for 0 < Ri_B ≤ 0.08 and another for 0.08 < Ri_B ≤ 0.2, and does not cover the condition Ri_B > 0.2; its relative errors over 0 < Ri_B ≤ 0.2 exceed 31% and 24% for the momentum and heat transfer coefficients, respectively. In contrast, the present scheme uses one equation for 0 < Ri_B ≤ 0.2 and another for Ri_B > 0.2, and its relative error is always less than 14%.
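For reference, the classical iterative computation used as the benchmark above can be sketched generically: the following Python sketch iterates on the stability parameter ζ = z/L until it is consistent with the bulk Richardson number Ri_B, then evaluates the momentum and heat transfer coefficients. The Businger-Dyer (unstable) and linear −5ζ (stable) stability functions, the neglect of roughness-height correction terms, and the example parameter values are assumptions for illustration only; this is neither the authors' proposed non-iterative scheme nor the Launiainen scheme.

```python
# Minimal sketch of a classical iterative Monin-Obukhov computation of the bulk
# transfer coefficients for momentum (C_M) and heat (C_H) from the bulk
# Richardson number Ri_B.  Stability functions are assumed Businger-Dyer
# (unstable) and -5*zeta (stable); roughness-height psi terms are neglected.
import math

KARMAN = 0.4  # von Karman constant

def psi_m(zeta):
    """Integrated stability function for momentum (assumed forms)."""
    if zeta < 0:                       # unstable
        x = (1.0 - 16.0 * zeta) ** 0.25
        return (2.0 * math.log((1.0 + x) / 2.0)
                + math.log((1.0 + x * x) / 2.0)
                - 2.0 * math.atan(x) + math.pi / 2.0)
    return -5.0 * zeta                 # stable (simple linear form)

def psi_h(zeta):
    """Integrated stability function for heat (assumed forms)."""
    if zeta < 0:
        x = (1.0 - 16.0 * zeta) ** 0.25
        return 2.0 * math.log((1.0 + x * x) / 2.0)
    return -5.0 * zeta

def transfer_coefficients(ri_b, z, z0m, z0h, n_iter=50, tol=1e-6):
    """Iterate on zeta = z/L until consistent with Ri_B, then return (C_M, C_H).
    Convergence of this simple fixed-point loop degrades as Ri_B approaches ~0.2."""
    ln_m = math.log(z / z0m)
    ln_h = math.log(z / z0h)
    zeta = ri_b * ln_m                     # crude first guess
    for _ in range(n_iter):
        fm = ln_m - psi_m(zeta)
        fh = ln_h - psi_h(zeta)
        zeta_new = ri_b * fm * fm / fh     # from Ri_B = zeta * fh / fm**2
        if abs(zeta_new - zeta) < tol:
            zeta = zeta_new
            break
        zeta = zeta_new
    fm = ln_m - psi_m(zeta)
    fh = ln_h - psi_h(zeta)
    return KARMAN ** 2 / (fm * fm), KARMAN ** 2 / (fm * fh)

# Example: unstable case with a large momentum-to-heat roughness length ratio.
print(transfer_coefficients(ri_b=-0.5, z=10.0, z0m=0.01, z0h=0.0001))
```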
Based on expert analyses and theories of heavy-rain forecasting, this article summarizes a variety of current forecasting tools and methods and, following the reasoning of experienced forecasters, distills their heavy-rain forecasting experience into an inference tree of 106 nodes, from which 101 rule bases are derived. The logical inference is carried out automatically on a microcomputer with PROLOG, an artificial-intelligence language that we introduced and further developed. The process adopts an uncertain-inference method based on fuzzy set theory, breaking through the limits of two-valued logic and better reflecting the way humans reason.
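The abstract does not reproduce the rule bases themselves; as a rough Python illustration of the uncertain-inference idea (fuzzy truth values propagated through the nodes of an inference tree instead of two-valued logic), a minimal sketch might look like the following. The premises, rules, confidence values, and the min/max combination operators are hypothetical and are not taken from the article's 106-node tree or 101 rule bases.

```python
# Minimal sketch of fuzzy (uncertain) rule inference over a small inference tree.
# All premises, rules, and truth values below are hypothetical illustrations.

# Fuzzy truth values in [0, 1] assigned to elementary premises (hypothetical).
facts = {
    "abundant_moisture": 0.8,
    "strong_low_level_jet": 0.6,
    "upper_level_divergence": 0.7,
    "surface_convergence": 0.5,
}

# Each rule: (conclusion, list of premises, rule confidence).
# Premises are combined with min (fuzzy AND); alternative rules for the same
# conclusion are combined with max (fuzzy OR).
rules = [
    ("unstable_stratification", ["abundant_moisture", "strong_low_level_jet"], 0.9),
    ("strong_lifting", ["upper_level_divergence", "surface_convergence"], 0.8),
    ("heavy_rain", ["unstable_stratification", "strong_lifting"], 0.85),
]

def infer(goal):
    """Recursively evaluate the fuzzy truth value of a goal node."""
    if goal in facts:
        return facts[goal]
    candidates = []
    for conclusion, premises, confidence in rules:
        if conclusion == goal:
            truth = min(infer(p) for p in premises) * confidence  # fuzzy AND, scaled
            candidates.append(truth)
    return max(candidates) if candidates else 0.0                 # fuzzy OR over rules

print("heavy_rain truth value:", round(infer("heavy_rain"), 3))
```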