41.
Recent studies of the Baltic clam Macoma balthica (L.) from the southern Baltic (the Gulf of Gdansk) have revealed striking morphological, histological and cytogenetic features. Strong deformation of the shell, including elongation of the posterior end and an easily visible flexure in this part, has been recorded. The population contribution of the deformed, blunt-shelled ("irregular") clams ranged from 0% to 65% and tended to increase with depth. The morphologically "irregular" clams had higher accumulated tissue concentrations of trace metals (As, Ag, Cd, Pb, Cu and Zn), indicating a different metal-handling ability. Adverse conditions in the deeper water regions of the Gulf (e.g. hypoxia, hydrogen sulphide, elevated bioavailability of contaminants) have been suggested as inducers of the phenotypic changes (morphological deformation) in part of the population and, in parallel, of the specific physiological adaptations that result in higher metal accumulation in the "irregular" clams. Cytogenetic and histological analyses showed the presence of tumours in the gill cells and digestive system of the affected clams, with the prevalence of disseminated neoplasia ranging from 0% to 94% depending on the site. The disease was manifested by a modified karyotype (i.e. an abnormal number and morphology of chromosomes), a higher activity of nucleolar organizer regions (AgNORs), and tissue lesions (enlarged, actively proliferating cells with pleomorphic nuclei). Bottom sediments showed acute toxicity and have been proposed as the source of an initiating carcinogenic factor. However, none of the ecotoxicological studies performed clearly identified a single (or multifactorial) agent that can account for the disseminated neoplasia.
42.
This is the third in a series of papers in which we explore the evolution of iron-rich ejecta from quark-novae. In the first paper, we explored the case where the quark-nova ejecta forms a degenerate shell, supported by the star's magnetic field, with applications to SGRs. In the second paper, we considered the case where the ejecta has sufficient angular momentum to form a degenerate Keplerian torus and applied such a system to two AXPs, namely 1E2259+586 and 4U0142+615. Here, we explore the late evolution of the degenerate torus and find that it can remain unchanged for ∼10⁶ yr before it becomes non-degenerate. This transition from a degenerate torus (accretion-dominated) to a non-degenerate disc (no accretion) occurs about 10⁶ yr after the quark-nova and exhibits features reminiscent of the observed properties of Rotating RAdio Transients (RRATs). Using this model, we can account for the duration of both the radio bursts and the quiet phase, as well as the observed radio flux from RRATs. We discuss a connection between XDINs and RRATs and argue that some XDINs may be 'dead RRATs' that have already consumed their non-degenerate disc.
43.
Luciola is a large (1 km) "multi-aperture densified-pupil imaging interferometer", or "hypertelescope", employing many small apertures, rather than a few large ones, for obtaining direct snapshot images with a high information content. A diluted collector mirror, deployed in space as a flotilla of small mirrors, focuses a sky image which is exploited by several beam-combiner spaceships. Each contains a "pupil densifier" micro-lens array to avoid the diffractive spread and image attenuation caused by the small sub-apertures. The elucidation of hypertelescope imaging properties during the last decade has shown that many small apertures tend to be far more efficient, regarding the science yield, than a few large ones providing a comparable collecting area. For similar underlying physical reasons, radio-astronomy has also evolved in the direction of many-antenna systems such as the proposed Low Frequency Array having "hundreds of thousands of individual receivers". With its high limiting magnitude, reaching the m_v = 30 limit of HST when 100 collectors of 25 cm match its collecting area, high-resolution direct imaging in multiple channels, broad spectral coverage from the 1,200 Å ultra-violet to the 20 μm infra-red, apodization, coronagraphic and spectroscopic capabilities, the proposed hypertelescope observatory addresses very broad and innovative science covering different areas of ESA's Cosmic Vision program. In the initial phase, a focal spacecraft covering the UV to near IR spectral range of EMCCD photon-counting cameras (currently 200 to 1,000 nm) will image details on the surface of many stars, as well as their environment, including multiple stars and clusters. Spectra will be obtained for each resel. It will also image neutron star, black-hole and micro-quasar candidates, as well as active galactic nuclei, quasars, gravitational lenses, and other Cosmic Vision targets observable with the initial modest crowding limit. With subsequent upgrade missions, the spectral coverage can be extended from 120 nm to 20 μm, using four detectors carried by two to four focal spacecraft. The number of collector mirrors in the flotilla can also be increased from 12 to 100 and possibly 1,000. The imaging and spectroscopy of habitable exoplanets in the mid infra-red then becomes feasible once the collecting area reaches 6 m², using a specialized mid infra-red focal spacecraft. Calculations (Boccaletti et al., Icarus 145, 628–636, 2000) have shown that hypertelescope coronagraphy has unequalled sensitivity for detecting, at mid infra-red wavelengths, faint exoplanets within the exo-zodiacal glare. Later upgrades will enable the more difficult imaging and spectroscopy of these faint objects at visible wavelengths, using refined techniques of adaptive coronagraphy (Labeyrie and Le Coroller 2004). Together, the infra-red and visible spectral data carry rich information on the possible presence of life. The close environment of the central black-hole in the Milky Way will be imageable with unprecedented detail in the near infra-red. Cosmological imaging of remote galaxies at the limit of the known universe is also expected, from the ultra-violet to the near infra-red, following the first upgrade, and with greatly increasing sensitivity through successive upgrades.
These areas will indeed greatly benefit from the upgrades, in terms of dynamic range, limiting complexity of the objects to be imaged, size of the elementary "Direct Imaging Field", and limiting magnitude, approaching that of an 8-m space telescope when 1,000 apertures of 25 cm are installed. Similar gains will occur for addressing fundamental problems in physics and cosmology, particularly when observing neutron stars and black holes, single or binary, including the giant black holes, with accretion disks and jets, in active galactic nuclei beyond the Milky Way. Gravitational lensing and micro-lensing patterns, including time-variable patterns and perhaps millisecond lensing flashes which may be beamed by diffraction from sub-stellar masses at sub-parsec distances (Labeyrie, Astron Astrophys 284, 689, 1994), will also be observable initially in the favourable cases, and upgrades will greatly improve the number of observable objects. The observability of gravitational waves emitted by binary lensing masses, in the form of modulated lensing patterns, is a debated issue (Ragazzoni et al., MNRAS 345, 100–110, 2003) but will also become addressable observationally. The technology readiness of Luciola approaches levels where low-orbit testing and stepwise implementation will become feasible in the 2015–2025 time frame. For the following decades beyond 2020, once accurate formation-flying techniques are mastered, much larger hypertelescopes such as the proposed 100 km Exo-Earth Imager and the 100,000 km Neutron Star Imager should also become feasible. Luciola is therefore also seen as a precursor toward such very powerful instruments.
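As a quick back-of-the-envelope check (added here for orientation, not part of the original abstract) of the claim that 100 collectors of 25 cm match HST's collecting area, the sketch below compares the summed aperture area of the flotilla with the geometric area of HST's 2.4 m primary, neglecting the secondary-mirror obstruction.

```python
# Collecting-area check: 100 apertures of 25 cm diameter vs. the 2.4 m HST
# primary (central obstruction neglected for simplicity).
from math import pi

luciola_area = 100 * pi * (0.25 / 2) ** 2   # ~4.9 m^2
hst_area = pi * (2.4 / 2) ** 2              # ~4.5 m^2
print(f"Luciola flotilla: {luciola_area:.1f} m^2, HST primary: {hst_area:.1f} m^2")
```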
44.
High resolution δ13C and δ18O profiles recorded in precisely dated speleothems are widely used proxies for past climate. Both δ13C and δ18O depend on several climate-related effects, including meteorological processes, processes occurring in the soil zone above the cave and isotope fractionation processes occurring in the solution layer on the stalagmite surface. Here we model the latter using a stalagmite isotope and growth model and determine the relationship between the stable isotope values in speleothem calcite and cave parameters, such as temperature, drip interval, water pCO2 and a mixing coefficient describing mixing between the solution layer and the impinging drop. The evolution of δ13C values is modelled as a Rayleigh distillation process and shows a pronounced dependence on the residence time of the solution on the stalagmite surface and, thus, on the drip interval. The evolution of δ18O values, in contrast, is also influenced by buffering reactions between the bicarbonate in the solution and the drip water, which drive the δ18O value of the bicarbonate towards the value expected for equilibrium isotope fractionation between drip water and calcite. This attenuates the dependence of the δ18O values on drip interval. The temperature dependence of δ18O, however, is more pronounced than that of δ13C and is in a similar range to that expected for fractionation under equilibrium conditions. We also investigate the isotopic enrichment of the δ13C and δ18O values along individual growth layers and, thus, the slopes expected for Hendy tests. The results show that a positive Hendy test is only possible if isotope fractionation occurred under disequilibrium conditions. However, a negative Hendy test does not exclude isotope fractionation under disequilibrium conditions. A more reliable indicator of disequilibrium fractionation is the enrichment of the δ13C values along an individual growth layer.
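For illustration only (this is not the authors' model), a minimal sketch of the Rayleigh distillation step mentioned above: the δ13C of the bicarbonate remaining in the solution film as CO2 degasses and calcite precipitates. The starting value delta0 and the enrichment factor epsilon are hypothetical placeholders; in the paper these quantities depend on temperature and drip interval.

```python
# Rayleigh-type 13C enrichment of the bicarbonate remaining in the solution
# film; delta0 and epsilon are hypothetical illustration values.
import numpy as np

def rayleigh_delta13c(f, delta0=-10.0, epsilon=-8.0):
    """delta13C (permil) of the remaining bicarbonate after a fraction (1 - f)
    has been removed; epsilon is the product-minus-reactant enrichment factor,
    so a negative epsilon enriches the remaining solution in 13C."""
    alpha = 1.0 + epsilon / 1000.0
    return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0

f = np.linspace(1.0, 0.2, 5)   # fraction of bicarbonate remaining in solution
for fi, d in zip(f, rayleigh_delta13c(f)):
    print(f"f = {fi:.2f}   delta13C = {d:+.2f} permil")
```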
45.
46.
Equilibrium relationships are defined between stream waters and the weathering products kaolinite and calcium montmorillonite for the Rio Tanama system, west-central Puerto Rico. The major element compositions of 46 water samples from springs and streams define a reaction path in the system CaO-Na2O-MgO-Al2O3-SiO2-H2O between acid waters containing low concentrations of alkali cations and detrital reactant minerals. The principal reactant phases appear to be chlorite, plagioclase and orthoclase, and occasionally anhydrite or calcite. Headward erosion by the Rio Tanama supplies the reactant phases to the stream silt load. The chemical denudation rate calculated for the Rio Tanama system is about 30 m/million yr. The chemical stream load appears to be buffered by the product phases in the main river over the 15–20 km river length sampled in this study. The silt and soil mineralogy and water compositions are used to define a log K for the hydrolysis of Ca-montmorillonite at 25°C of 35.0 ± 0.8. This value is in reasonable agreement with the value of 37.1 ± 1.0 defined by Garrels and Mackenzie (1967) in a similar manner for spring waters in the Sierra Nevada.
47.
A new approach is described to allow conditioning to both hard data (HD) and soft data in a patch- and distance-based multiple-point geostatistical simulation. Multinomial logistic regression is used to quantify the link between HD and soft data. The soft data are converted by the logistic regression classifier into as many probability fields as there are categories. The local category proportions are used and compared to the average category probabilities within the patch. Conditioning to HD is obtained by using alternative training images (TIs) and by imposing large relative weights on the HD. Conditioning to soft data is obtained by measuring the probability–proportion patch distance. Both 2D and 3D cases are considered. Synthetic cases show that a stationary TI can generate non-stationary realizations that reproduce the HD, keep the texture indicated by the TI and follow the trends identified in probability maps obtained from soft data. A real case study, the Mallik methane-hydrate field, shows perfect reproduction of the HD while keeping a good reproduction of the TI texture and probability trends.
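A minimal sketch (added for illustration, not the authors' implementation) of the soft-data-to-probability-field step: a multinomial logistic regression is fitted on collocated hard-data categories and soft attributes, then applied to the whole simulation grid to yield one probability map per category. The grid size, attribute values and facies codes below are hypothetical.

```python
# Convert soft data into per-category probability fields with a multinomial
# logistic regression; toy data, hypothetical attributes and facies codes.
import numpy as np
from sklearn.linear_model import LogisticRegression

# collocated training data: soft attributes at hard-data locations, plus categories
X_hd = np.array([[0.1, 2.3], [0.4, 1.1], [0.9, 0.2], [0.7, 0.5]])  # e.g. seismic attributes
y_hd = np.array([0, 0, 1, 1])                                      # facies category at each HD point

clf = LogisticRegression(multi_class="multinomial", solver="lbfgs")
clf.fit(X_hd, y_hd)

# soft attributes on the simulation grid (here a toy 10 x 10 grid, 2 attributes)
grid_soft = np.random.rand(100, 2)
prob_fields = clf.predict_proba(grid_soft)        # shape (n_cells, n_categories)
prob_maps = prob_fields.T.reshape(-1, 10, 10)     # one probability map per category
```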
48.
The production and use of nanomaterials will inevitably lead to their disposal in the natural environment. To assess the risk that these materials pose to human and ecosystem health, an understanding of their mobility and ultimate fate is essential. To date, however, relatively little research has been conducted on the fate of nanoparticles in subsurface systems. In this study the subsurface mobility of two carbon nanoparticles, nano-fullerenes (nC60) and multi-walled carbon nanotubes (MWCNTs), is assessed. A two-dimensional finite element model was used to simulate the movement of these nanoparticles under a range of hydrologic and geological conditions, including a heterogeneous permeability field. The numerical model is based on colloid filtration theory (CFT) with a maximum retention capacity term. For the conditions evaluated, the carbon nanotubes are much more mobile than nC60 because of the smaller collector efficiency associated with carbon nanotubes. However, the mobility of nC60 increased significantly when a maximum retention capacity term was included in the model. Model results also show that, for the systems examined, nanoparticles are less mobile in heterogeneous systems than in homogeneous systems with the same average hydraulic properties.
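For reference, the standard CFT attachment formulation with a Langmuirian maximum-retention-capacity (blocking) term is sketched below; this is the textbook form (ε porosity, d_c collector grain diameter, α attachment efficiency, η₀ single-collector contact efficiency, v pore-water velocity, ρ_b bulk density, θ volumetric water content, C aqueous concentration, S and S_max the attached and maximum attached concentrations), not necessarily the exact parameterization used in this study.

```latex
% Standard CFT deposition rate coefficient and Langmuirian blocking term
% (maximum retention capacity); the study's exact formulation may differ.
\begin{align}
  k_{\mathrm{att}} &= \frac{3\,(1-\varepsilon)}{2\,d_c}\,\alpha\,\eta_0\,v, \\
  \frac{\rho_b}{\theta}\,\frac{\partial S}{\partial t}
    &= k_{\mathrm{att}}\left(1 - \frac{S}{S_{\max}}\right) C .
\end{align}
```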
49.
Atom probe microscopy (APM) is a relatively new in situ tool for measuring isotope fractions in nanoscale volumes (< 0.01 μm³). Using counting statistics for a hypothetical data set, we calculate the theoretical detectable difference of an APM isotope ratio measurement to be ± 4δ, or 0.4% (2s). However, challenges associated with APM measurements (e.g., peak ranging, hydride formation and isobaric interferences) result in larger uncertainties if not properly accounted for. We evaluate these factors for Re-Os isotope ratio measurements by comparing APM and negative thermal ionisation mass spectrometry (N-TIMS) measurement results for pure Os, pure Re and two synthetic Re-Os-bearing alloys from Schwander et al. (2015, Meteoritics and Planetary Science, 50, 893) [the original metal alloy (HSE) and alloys produced by heating HSE within silicate liquid (SYN)]. From this, we propose a current best practice for APM Re-Os isotope ratio measurements. Using this refined approach, mean APM and N-TIMS 187Os/189Os measurement results agree within 0.05% and 2s (pure Os), 0.6–2% and 2s (SYN) and 5–10% (HSE). The good agreement of N-TIMS and APM 187Os/189Os measurements confirms that APM can extract robust isotope ratios. This approach therefore permits nanoscale isotope measurements of Os-bearing alloys, using the Re-Os geochronometer, that could not be made with conventional measurement techniques.
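As a quick check of the counting-statistics figure quoted above (added for illustration, not from the paper): the relative 2s uncertainty of a ratio of two ion counts N_a and N_b is 2·sqrt(1/N_a + 1/N_b). The counts below are hypothetical and chosen to reproduce the ~0.4% (2s) level.

```python
# Counting-statistics limit on an isotope ratio R = N_a / N_b:
# relative 1s uncertainty = sqrt(1/N_a + 1/N_b). The counts are hypothetical;
# ~5e5 detected ions per isotope give a ~0.4% (2s) limit.
from math import sqrt

def ratio_uncertainty_2s(n_a: float, n_b: float) -> float:
    """Relative 2s uncertainty (as a fraction) of the ratio n_a / n_b."""
    return 2.0 * sqrt(1.0 / n_a + 1.0 / n_b)

n_a = n_b = 5e5   # hypothetical ion counts per isotope peak
print(f"2s relative uncertainty: {100 * ratio_uncertainty_2s(n_a, n_b):.2f}%")  # -> 0.40%
```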
50.
Generalized cross-validation for covariance model selection
A weighted cross-validation technique, known in the spline literature as generalized cross-validation (GCV), is proposed for covariance model selection and parameter estimation. Weights for the prediction errors are selected to give more importance to clusters of points than to isolated points. Clustered points are estimated better by their neighbors and are more sensitive to the model parameters. This rational weighting scheme also significantly simplifies the computation of the cross-validation mean square error of prediction. With small- to medium-size datasets, GCV is performed in a global neighborhood. Optimization of the usual isotropic models requires only a small number of matrix inversions. A small dataset and a simulation are used to compare the performance of GCV with that of ordinary cross-validation (OCV) and least-squares fitting (LS).
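For orientation (added here, not taken from the paper), the classical unweighted GCV criterion from the spline literature that this technique generalizes can be written as below, where y is the data vector and A(θ) is the prediction ("hat") matrix implied by the covariance model with parameters θ; the paper's weighted variant differs in how the prediction errors are weighted.

```latex
% Classical GCV score from the spline literature (unweighted form);
% the paper's weighted criterion may differ in detail.
\mathrm{GCV}(\theta) \;=\;
  \frac{\tfrac{1}{n}\,\bigl\|\bigl(I - A(\theta)\bigr)\,y\bigr\|^{2}}
       {\Bigl[\tfrac{1}{n}\,\operatorname{tr}\bigl(I - A(\theta)\bigr)\Bigr]^{2}}
```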