A total of 1,219 results found (search time: 18 ms)
961.
962.
With detections of the Sunyaev–Zel'dovich (SZ) effect induced by galaxy clusters becoming routine, it is crucial to establish accurate theoretical predictions. We use a hydrodynamical N-body code to generate simulated maps, of size 1 deg², of the thermal SZ effect. This is done for three different cosmologies: the currently favoured low-density model with a cosmological constant, a critical-density model and a low-density open model. We stack simulation boxes corresponding to different redshifts in order to include contributions to the Compton y-parameter out to the highest necessary redshifts. Our main results are as follows.
(i) The mean y-distortion is around 4×10⁻⁶ for low-density cosmologies, and 1×10⁻⁶ for critical density. These are below current limits, but not by a wide margin in the former case.
(ii) In low-density cosmologies, the mean y-distortion is contributed across a broad range of redshifts, with the bulk coming from z ≲ 2 and a tail out to z ∼ 5. For critical-density models, most of the contribution comes from z < 1.
(iii) The number of SZ sources above a given y depends strongly on instrument resolution. For a 1-arcmin beam, there are around 0.1 sources per deg² with y > 10⁻⁵ in a critical-density Universe, and around 8 such sources per deg² in low-density models. Low-density models with and without a cosmological constant give very similar results.
(iv) We estimate that the Planck satellite will be able to see of order 25 000 SZ sources if the Universe has a low density, or around 10 000 if it has critical density.
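The stacking of per-redshift simulation boxes into a total Compton y map can be sketched in a few lines. This is a toy illustration only: the synthetic exponential maps stand in for the hydrodynamical simulation outputs, and the function names are assumptions, not the paper's code.

```python
import numpy as np

def stack_y_maps(box_maps):
    """Total Compton y map: sum of the per-redshift-box contributions
    along the line of sight (toy stand-ins for simulation outputs)."""
    return np.sum(box_maps, axis=0)

def sz_summary(y_map, y_threshold=1e-5):
    """Mean y distortion and the fraction of pixels above a threshold."""
    return y_map.mean(), (y_map > y_threshold).mean()

rng = np.random.default_rng(0)
# Three hypothetical "redshift boxes" of small positive y contributions.
boxes = [rng.exponential(1.5e-6, size=(128, 128)) for _ in range(3)]
y_tot = stack_y_maps(boxes)
mean_y, frac_bright = sz_summary(y_tot)
```

With these toy inputs the mean distortion lands near the sum of the per-box means, a few ×10⁻⁶, the same order as the paper's low-density result.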
963.
Near-infrared linear imaging polarimetry of the young stellar objects R CrA and T CrA in the J, H and Kn bands, and circular imaging polarimetry in the H band, is presented. The data are modelled with the Clark and McCall scattering model. The R CrA and T CrA system is shown to be a particularly complex scattering environment. In the case of R CrA there is evidence that the wavelength dependence of polarization changes across the nebula. MRN dust grain models do not explain this behaviour. Depolarization by line emission is considered as an alternative explanation. The dust grain properties could also be changing across the nebula.
Although surrounded by reflection nebulosity, there is a region of particularly low polarization around R CrA that is best modelled by the canonical bipolar outflow being truncated by an evacuated spherical cavity surrounding the star. The symmetry axis of the nebula appears inclined by 50° to the plane of the sky.
The H-band circular polarimetry of R CrA clearly shows a quadrupolar structure of positive and negative degrees of circular polarization, reaching peak magnitudes of ∼5 per cent within our limited map. It is shown that spherical MRN grains are incapable of producing this circular polarization given the observed linear polarization of the R CrA system. Instead, scattering from aligned non-spherical grains is proposed as the operating mechanism.
T CrA is a more archetypal bipolar reflection nebula, and this object is modelled as a canonical parabolic reflection nebula that lies in the plane of the sky. The wavelength independence of linear polarization in the T CrA reflection nebula suggests that the scattering particles are Rayleigh sized. This is modelled with the MRN interstellar grain size distribution.
964.
There is currently a lack of well-characterised matrix-matched reference materials (RMs) for forensic analysis of U-rich materials at high spatial resolution. This study reports a detailed characterisation of uraninite (nominally UO2+x) from the Happy Jack Mine (UT, USA). The Happy Jack uraninite can be used as an RM for the determination of rare earth element (REE) mass fractions in nuclear materials, which provide critical information for source attribution purposes. The investigation includes powder X-ray diffraction (pXRD) data, as well as major, minor and trace element abundances determined using a variety of micro-analytical techniques. The chemical signature of the uraninite was investigated at the macro (cm) scale with micro-X-ray fluorescence (µXRF) mapping and at high spatial resolution (tens of micrometres) using electron probe microanalysis (EPMA) and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). Based on the EPMA results, the uraninite is characterised by homogeneous UO2 and CaO contents of 91.57 ± 1.49% m/m (2s uncertainty) and 2.70 ± 0.38% m/m (2s), respectively. CaO abundances were therefore used as the internal standard for the LA-ICP-MS analyses. Overall, the major element and REE compositions are homogeneous at both the centimetre and micrometre scales, allowing this material to be used as an RM for high spatial resolution analysis of U-rich samples.
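Internal standardization of the kind described, using an independently determined CaO content to quantify LA-ICP-MS signals, can be sketched generically as follows. This is a simplified textbook-style reduction, not the study's actual data-reduction code, and all variable names and numbers are illustrative assumptions (consistent units are assumed throughout).

```python
def quantify(cps_an_sam, cps_is_sam, cps_an_std, cps_is_std,
             conc_an_std, conc_is_std, conc_is_sam):
    """Single-element internal-standard quantification for LA-ICP-MS.

    The analyte/internal-standard count-rate ratio measured in the
    sample is scaled by the same ratio measured on a calibration
    standard of known composition, then by the independently known
    internal-standard concentration in the sample (e.g. CaO from EPMA).
    """
    r_sam = cps_an_sam / cps_is_sam      # analyte/IS ratio in sample
    r_std = cps_an_std / cps_is_std      # analyte/IS ratio in standard
    return conc_is_sam * (r_sam / r_std) * (conc_an_std / conc_is_std)

# Hypothetical example: Ce in uraninite with CaO as internal standard.
ce_sample = quantify(cps_an_sam=1500, cps_is_sam=500,
                     cps_an_std=2000, cps_is_std=1000,
                     conc_an_std=10.0, conc_is_std=5.0,
                     conc_is_sam=2.7)
```

As a sanity check on the formula: if the sample and standard have identical count-rate ratios and identical internal-standard contents, the analyte concentration returned equals that of the standard.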
965.
Testing for spatial association of qualitative data using symbolic dynamics
Qualitative spatial variables are important in many fields of research. However, unlike the decades' worth of research devoted to the spatial association of quantitative variables, exploratory analysis of qualitative spatial variables is relatively undeveloped. The objective of the present paper is to propose a new test (Q) for spatial independence. This is a simple, consistent and powerful statistic for qualitative spatial independence that we develop using concepts from symbolic dynamics and symbolic entropy. The Q test can be used to detect, given a spatial distribution of events, patterns of spatial association of qualitative variables in a wide variety of settings. To enable hypothesis testing, we give the asymptotic distribution of an affine transformation of the symbolic entropy under the null hypothesis of independence in the spatial qualitative process. We include numerical experiments to demonstrate the finite-sample behaviour of the test, and show its application by means of an empirical example that explores the spatial association of fast food establishments in the Greater Toronto Area in Canada.
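The symbolization step behind such a test can be illustrated with a minimal sketch: each location is mapped to a symbol built from its own category and the categories of its m nearest neighbours, and the entropy of the empirical symbol distribution is then compared with its value under spatial independence (where it approaches its maximum). The function names and toy data below are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
import math

def symbolize(values, neighbors):
    """One symbol per location: the tuple formed by its own qualitative
    category and those of its m nearest neighbours. `neighbors[i]`
    lists the neighbour indices of location i."""
    return [(values[i],) + tuple(values[j] for j in nbrs)
            for i, nbrs in enumerate(neighbors)]

def symbolic_entropy(symbols):
    """Shannon entropy (natural log) of the empirical symbol
    distribution. Spatial association concentrates the distribution
    on few symbols and lowers the entropy."""
    n = len(symbols)
    return -sum((c / n) * math.log(c / n)
                for c in Counter(symbols).values())

# Toy 1-D arrangement of two categories with nearest-neighbour symbols.
values = ["fastfood", "fastfood", "other", "other"]
neighbors = [(1,), (0,), (3,), (2,)]
h = symbolic_entropy(symbolize(values, neighbors))
```

In this clustered toy example only two of the four possible symbols occur, each with probability 1/2, so the entropy is ln 2, below the ln 4 maximum of an independent arrangement.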
966.
In this paper we compare two estimation methods for dealing with samples of different support: (1) an indirect approach using accumulation and (2) kriging with samples of different support. The two methods were tested on a simple example, and their estimates were compared against a benchmark scenario consisting of kriging with a complete set of samples on the same support. The effects of the nugget effect and of the variogram range and type on the weight of long samples, the estimate and the error variance were assessed. Kriging with samples of different support led to lower error variance and to estimates closer to those of the benchmark scenario. Furthermore, for spatially continuous attributes (low nugget effect), the indirect approach assigns greater weight to long samples than kriging with samples of different support does. A cross-validation study comparing the two methods on a database from a bauxite deposit was also performed; its results showed that kriging with samples of different support yields more precise estimates.
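The benchmark scenario of such a comparison, ordinary kriging with all samples on the same point-like support, can be sketched as below; handling samples of different support would replace the point-to-point covariances with covariances averaged over each sample length. The variogram parameters and coordinates are illustrative assumptions.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng_m=10.0, nugget=0.1):
    """Covariance from an exponential variogram with a nugget:
    C(0) = sill; C(h) = (sill - nugget) * exp(-3h/range) for h > 0."""
    c = (sill - nugget) * np.exp(-3.0 * h / rng_m)
    return np.where(h == 0, sill, c)

def ordinary_kriging(coords, values, target, cov=exp_cov):
    """Ordinary kriging estimate and error variance at `target`."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0                       # Lagrange row/column: sum(w) = 1
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=1))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w @ values, float(cov(np.array(0.0)) - b[:n] @ w - mu)

coords = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
values = np.array([1.0, 2.0, 3.0])
est, var = ordinary_kriging(coords, values, np.array([0.0, 0.0]))
```

A useful check on the system: at a sample location ordinary kriging is exact, so estimating at the first sample point returns its value with zero error variance.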
967.
The seismic hazard and risk analysis for the onshore Groningen gas field requires information about local soil properties, in particular shear-wave velocity (VS). A fieldwork campaign was conducted at 18 surface accelerograph stations of the monitoring network. The subsurface in the region consists of unconsolidated sediments and is heterogeneous in composition and properties. A range of methods was applied to acquire in situ VS values to a target depth of at least 30 m: seismic cone penetration tests (SCPT) with varying source offsets, multichannel analysis of surface waves (MASW) on Rayleigh waves with different processing approaches, microtremor array measurements, cross-hole tomography and suspension P-S logging. The offset SCPT, cross-hole tomography and common-midpoint cross-correlation (CMPcc) processing of the MASW data all revealed lateral variations on length scales of several to tens of metres in this geological setting. SCPTs yielded very detailed VS profiles with depth, but represent point measurements in a heterogeneous environment. The MASW results represent VS information on a larger spatial scale and smooth some of the heterogeneity encountered at the sites. The combination of MASW and SCPT proved to be a powerful and cost-effective approach for determining representative VS profiles at the accelerograph station sites. The measured VS profiles correspond well with the modelled profiles and significantly enhance the ground-motion model derivation. The theoretical transfer functions computed from the VS profiles also agree closely with the amplification observed at vertical-array stations.
968.
The Itajaí River basin is one of the areas most affected by flood-related disasters in Brazil. Flood hazard maps based on digital elevation models (DEMs) are an important alternative in the absence of detailed hydrological data and for application over large areas. We developed a flood hazard mapping methodology, f2HAND, that combines flow frequency analysis with the Height Above the Nearest Drainage (HAND) model, and applied it in three municipalities in the Itajaí River basin. Its performance was evaluated through comparison with observed 2011 flood extent maps, and model sensitivity was tested for different DEM resolutions, return periods and streamflow data from stations located upstream and downstream on the main river. Flood hazard mapping with the combined approach matched 92% of the 2011 flood event. We found that the f2HAND model has low sensitivity to DEM resolution and high sensitivity to the area threshold of channel initiation.
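The core of HAND-based mapping can be sketched in a few lines: cells whose height above the nearest drainage falls at or below the flood stage for a given return period are classified as inundated, and the result is scored against an observed flood extent. The arrays and the 2.5 m stage below are illustrative assumptions; in f2HAND the stage comes from flow-frequency analysis of gauged streamflow.

```python
import numpy as np

def flood_extent(hand, stage):
    """Boolean inundation map: HAND (m) at or below the flood stage (m)."""
    return hand <= stage

def hit_rate(predicted, observed):
    """Fraction of observed flooded cells matched by the prediction
    (the kind of agreement score behind a '92% match' figure)."""
    return (predicted & observed).sum() / observed.sum()

hand = np.array([[0.5, 2.0],
                 [3.0, 10.0]])          # toy HAND raster, metres
observed = np.array([[True, True],
                     [True, False]])    # toy observed flood extent
pred = flood_extent(hand, stage=2.5)
score = hit_rate(pred, observed)
```

In the toy case two of the three observed flooded cells sit below the 2.5 m stage, so the hit rate is 2/3.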
969.
After an earthquake, non-negligible residual displacements may affect the serviceability of a base-isolated structure if the isolation system does not possess good restoring capability. The permanent offset does not compromise structural performance, but it can be problematic for utility connections and raises concerns about the maintenance of the devices. Starting from experimental and analytical results of previous studies, the restoring capability of Double Concave Friction Pendulum bearings is investigated in this paper. A simplified design suggestion for estimating the maximum expected residual displacements of currently used friction pendulum systems is then validated. The study is based on controlled-displacement and seismic-input experiments, both performed under unidirectional motion. Several shaking table tests were carried out on a three-dimensional isolated specimen structure; the same sequence of seismic inputs was applied for three conditions of the sliding surfaces, corresponding to low, medium and high friction. The accumulation of residual displacements is also investigated by means of nonlinear dynamic analysis. Copyright © 2017 John Wiley & Sons, Ltd.
970.
SfM photogrammetry has recently enabled earth-surface reconstruction with high spatial resolution. Its flexible data acquisition and high potential for process automation also allow a significant increase in temporal resolution, which is especially interesting for assessing geomorphic change. Two case studies are presented in which 4D reconstruction is performed to study soil-surface changes at 15-second intervals: (a) a thunderstorm event captured at field scale and (b) a rainfall simulation observed at plot scale. A workflow is introduced for automatic data acquisition and processing comprising the following steps: data collection; camera calibration and subsequent image correction; template matching to automatically identify ground control points in each image, to account for camera movements; 3D reconstruction of each acquisition interval; and finally temporal filtering of the resulting surface-change models to correct random noise and increase the reliability with which low-intensity change signals are measured. Results reveal surface-change detection with cm- to mm-accuracy. Significant soil changes are measured during the events, and ripple-and-pool sequences become obvious in both case studies. Additionally, roughness changes and hydrostatic effects are apparent in the temporal domain at the plot scale. 4D monitoring with time-lapse SfM photogrammetry enables new insights into geomorphic processes through a significant increase in temporal resolution. Copyright © 2017 John Wiley & Sons, Ltd.
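The template-matching step of such a workflow (automatically re-locating ground control points in every frame to compensate for camera movement) can be sketched with a brute-force normalized cross-correlation search. This is a minimal stand-in, not the authors' implementation; production code would typically use an optimized library routine, and the synthetic image here is an assumption.

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) of the window that maximises normalized
    cross-correlation with the template, plus that correlation score."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_score = (0, 0), -np.inf
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best = score, (r, c)
    return best, best_score

rng = np.random.default_rng(42)
image = rng.random((20, 20))           # synthetic camera frame
template = image[7:12, 5:10].copy()    # "GCP" patch cut from the frame
loc, score = match_template(image, template)
```

Because the template is cut directly from the frame, the search recovers its true position with a correlation of essentially 1.0; in the real workflow the recovered positions anchor each 3D reconstruction despite small camera movements.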

Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)