101.
The Latur earthquake (Mw 6.1) of 29 September 1993 is a rare stable continental region (SCR) earthquake that occurred on a previously unknown blind fault. In this study, we determined detailed three-dimensional (3-D) P- and S-wave velocity (Vp, Vs) and Poisson's ratio (σ) structures by inverting high-quality first-arrival times of P and S waves from 142 aftershocks recorded by a network of temporary seismic stations. The source zone of the Latur earthquake shows strong lateral heterogeneities in the Vp, Vs and σ structures, extending over a volume of about 90 × 90 × 15 km³. The mainshock occurred within, but near the boundary of, a low-Vp, high-Vs and low-σ zone. This suggests that the structural asperities at the mainshock hypocenter are associated with partially fluid-saturated fractured rock in a previously unknown source zone with intersecting fault surfaces, which might have triggered the 1993 Latur mainshock and its aftershock sequence. Our results agree well with other geophysical studies that suggest high conductivity and a high concentration of radiogenic helium gas beneath the source zone of the Latur earthquake. Our study provides additional evidence for a fluid-related anomaly at the hidden source zone of the Latur earthquake in the SCR and helps us understand the genesis of damaging earthquakes in SCRs worldwide.
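Since the abstract discusses Vp, Vs and Poisson's ratio together, it may help to recall how σ follows from the velocity ratio. A minimal sketch using the standard isotropic elastic relation; the velocity values below are illustrative assumptions, not values from the study:

```python
# Poisson's ratio from P- and S-wave velocities (standard isotropic elastic relation).
# The velocity values below are illustrative only, not taken from the Latur study.

def poissons_ratio(vp: float, vs: float) -> float:
    """Return Poisson's ratio sigma for given Vp and Vs (same units)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

if __name__ == "__main__":
    vp, vs = 6.0, 3.5  # km/s, hypothetical crustal values
    print(f"Vp/Vs = {vp / vs:.2f}, sigma = {poissons_ratio(vp, vs):.3f}")
```

A relatively high Vs for a given Vp (i.e., a low Vp/Vs ratio) pushes σ down, which is why a low-Vp, high-Vs anomaly also appears as a low-σ zone.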
102.
Numerical models are starting to be used to determine the future behaviour of seismic faults and fault networks. Their ultimate goal would be to forecast future large earthquakes. In order to use them for this task, it is necessary to synchronize each model with the current state of the actual fault or fault network it simulates (just as, for example, meteorologists synchronize their models with the atmosphere by incorporating current atmospheric data into them). However, lithospheric dynamics is largely unobservable: important parameters cannot (or can only rarely) be measured in nature. Earthquakes, though, provide indirect but measurable clues to the stress and strain state of the lithosphere, which should be helpful for synchronizing the models. The rupture area is one of the measurable parameters of an earthquake. Here we explore how it can be used at least to synchronize fault models with one another and to forecast synthetic earthquakes. Our purpose is to forecast synthetic earthquakes in a simple but stochastic (random) fault model. By imposing the rupture areas of the synthetic earthquakes of this model on other models, the latter become partially synchronized with the first one. We use these partially synchronized models to successfully forecast most of the largest earthquakes generated by the first model. This forecasting strategy outperforms others that only take the earthquake series into account. Our results suggest that a good way to synchronize more detailed models with real faults is probably to force them to reproduce the sequence of previous earthquake ruptures on those faults. This hypothesis could be tested in the future with more detailed models and actual seismic data.
103.
The Great Lisbon earthquake has the largest documented felt area of any shallow earthquake and an estimated magnitude of 8.5–9.0. The associated tsunami ravaged the coasts of SW Portugal and the Gulf of Cadiz, with run-up heights reported to have reached 5–15 m. While several source regions offshore SW Portugal have been proposed (e.g., Gorringe Bank, Marquis de Pombal fault), no single source appears able to account for the great seismic moment as well as all the historical tsunami amplitude and travel-time observations. A shallow east-dipping fault plane beneath the Gulf of Cadiz, associated with active subduction beneath Gibraltar, represents a candidate source for the Lisbon earthquake of 1755. Here we consider the fault parameters implied by this hypothesis, with respect to total slip, seismic moment and recurrence interval, to test the viability of this source. The geometry of the seismogenic zone is obtained from deep crustal studies and can be represented by an east-dipping fault plane with mean dimensions of 180 km (N–S) × 210 km (E–W). For 10 m of co-seismic slip an Mw 8.64 event results, and for 20 m of slip an Mw 8.8 earthquake is generated. Thus, for convergence rates of about 1 cm/yr, an event of this magnitude could occur every 1000–2000 years. Available kinematic and sedimentological data are in general agreement with such a recurrence interval. Tsunami waveform modeling indicates that a subduction source in the Gulf of Cadiz can partly satisfy the historical observations with respect to wave amplitudes and arrival times, though discrepancies remain for some stations. A macroseismic analysis is performed using site-effect functions calculated from isoseismals observed during instrumentally recorded strong earthquakes in the region (M7.9 in 1969 and M6.8 in 1964). The resulting synthetic isoseismals for the 1755 event suggest that a subduction source, possibly in combination with an additional source at the NW corner of the Gulf of Cadiz, can satisfactorily explain the historically observed seismic intensities. Further studies are needed to sample the turbidites in the adjacent abyssal plains to better document the source region and more precisely calibrate the chronology of great earthquakes in this region.
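A quick way to see how the quoted magnitudes and recurrence interval follow from the fault geometry is to run the standard seismic-moment arithmetic. A minimal sketch, assuming a shear modulus of 30 GPa (not given in the abstract) and the Hanks–Kanamori moment-magnitude relation:

```python
import math

# Sketch of the moment-magnitude arithmetic behind the quoted figures.
# Fault dimensions and slip values come from the abstract; the shear modulus
# (30 GPa) is an assumption of this sketch, not stated in the abstract.

MU = 3.0e10                      # shear modulus, Pa (assumed)
AREA = 180e3 * 210e3             # fault area, m^2 (180 km x 210 km)

def moment_magnitude(slip_m: float) -> float:
    m0 = MU * AREA * slip_m                  # seismic moment, N*m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

for slip in (10.0, 20.0):
    recurrence = slip / 0.01                 # years, at ~1 cm/yr convergence
    print(f"slip {slip:4.0f} m -> Mw {moment_magnitude(slip):.2f}, "
          f"~{recurrence:.0f} yr to re-accumulate")
```

With these assumptions the sketch reproduces the quoted Mw 8.64 (10 m of slip) and ≈8.8 (20 m of slip), and dividing the slip by the ~1 cm/yr convergence rate gives the 1000–2000 year recurrence interval.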
104.
Ice and snow have often helped physicists understand the world. By contrast, it took them a very long time to understand the flow of glaciers. Naturalists only began to take an interest in glaciers at the beginning of the 19th century, during the last phase of glacier advances. Once it became obvious that glaciers flow from upslope, it was necessary to understand how they flow. It was only in 1840, the year Dumont d'Urville discovered the Antarctic ice sheet, that two books laid the basis for the future field of glaciology: one by Agassiz on the ice age and glaciers, the other by canon Rendu on glacier theory. During the 19th century, the ice-flow theories adopted by most of the leading scientists were based on melting/refreezing processes. Even though the word 'fluid' was first used in 1773 to describe ice, more than 130 years would go by before the laws of fluid mechanics were applied to ice. Even now, the parameter of Glen's law, which glaciologists use to model ice deformation, can take a very wide range of values, so that no unique ice-flow law has yet been defined. To cite this article: F. Rémy, L. Testut, C. R. Geoscience 338 (2006).
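For reference, the Glen's law mentioned here relates ice strain rate to shear stress. Its standard textbook form is sketched below; the typical values quoted are general glaciological figures, not taken from this article:

```latex
% Glen's flow law (standard form; typical values are textbook figures, not from this article)
\dot{\varepsilon} = A\,\tau^{n},
\qquad n \approx 3, \quad A = A(T)\ \text{strongly temperature-dependent}
```

The wide spread of reported values for the exponent and the rate factor is what the abstract means by "no unique ice-flow law has yet been defined".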
105.
Zhang Weizheng (张维正), Exploration Engineering (《探矿工程》), 2006, 33(10): 60–62
Taking the underwater foundation treatment works for a general cargo wharf and a gravity-type container wharf as a case study, this paper describes the explosive-compaction test programme carried out as part of the foundation treatment, derives the blasting-compaction parameters for single-pass explosive compaction of a thick dumped-riprap foundation bed, and reports the monitoring performed during construction. Experience and lessons applicable to similar projects are summarized.
106.
The area of Serravalle, in the northern part of the town of Vittorio Veneto (TV), NE Italy, has been the target of a seismic microzonation campaign. Ten seismic stations were deployed for a seven-month period to record in continuous mode. Three stations were installed on bedrock outcrops and seven on sedimentary sites with variable cover thickness. Spectral analyses were performed on the collected dataset using the Generalized Inversion Technique (GIT, e.g. Andrews, 1986). In particular, spectral ratios were calculated for each station relative to the average of the three reference bedrock sites. The spectral ratios provide quantitative estimates of the seismic-motion amplification occurring at each of the monitored sites. Two sites show high amplification values, five times larger than the signal amplitude at the reference sites, at well-defined peak frequencies of 5 Hz. Results for the other stations show smaller site amplification spread over a broad range of frequencies. The sites where the highest amplifications were recorded all lie on the left bank of the Meschio River and in areas farther from its outlet into the plain, correlating with the presence of thick layers of Quaternary deposits.
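A bare-bones version of the reference-site spectral ratio underlying these amplification estimates is sketched below: a plain ratio of a sediment-site amplitude spectrum to the averaged bedrock spectrum, not the full generalized-inversion scheme of the paper. The demo traces are synthetic noise standing in for real recordings:

```python
import numpy as np

# Minimal sketch of a reference-site spectral ratio (not the paper's full
# generalized-inversion implementation). All signals are assumed to be
# pre-cut windows at the same sampling rate.

def amplitude_spectrum(trace: np.ndarray, dt: float):
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    return freqs, np.abs(np.fft.rfft(trace * np.hanning(trace.size)))

def spectral_ratio(soil: np.ndarray, rock_traces: list, dt: float):
    freqs, soil_spec = amplitude_spectrum(soil, dt)
    rock_spec = np.mean([amplitude_spectrum(r, dt)[1] for r in rock_traces], axis=0)
    return freqs, soil_spec / rock_spec

# Synthetic demo data (random noise stands in for real recordings).
rng = np.random.default_rng(1)
dt, n = 0.01, 2048
soil = rng.standard_normal(n)
rock = [rng.standard_normal(n) for _ in range(3)]
f, ratio = spectral_ratio(soil, rock, dt)
print(f"peak ratio {ratio[1:].max():.1f} near {f[1:][ratio[1:].argmax()]:.1f} Hz")
```

In practice the ratio would be smoothed and averaged over many events before interpreting peak frequencies and amplification factors.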
107.
A new earthquake catalogue for central, northern and northwestern Europe with unified Mw magnitudes, in part derived from chi-square maximum-likelihood regressions, forms the basis for seismic hazard calculations for the Lower Rhine Embayment. Uncertainties in the various input parameters are introduced, a detailed seismic zonation is performed, and a recently developed technique for estimating the maximum expected magnitude is adopted and quantified. Applying the logic-tree algorithm yields hazard values with error estimates as fractile curves (median, 16% and 84% fractiles, and mean), plotted for pga (peak ground acceleration; median values for Cologne of 0.7 and 1.2 m/s² for probabilities of exceedance of 10% and 2%, respectively, in 50 years), 0.4 s (0.8 and 1.5 m/s²) and 1.0 s (0.3 and 0.5 m/s²) pseudoaccelerations, and intensity (I0 = 6.5 and 7.2). For the ground-motion parameters, rock foundation is assumed. For the area near Cologne and Aachen, maps show the median and 84% fractile hazard for a 2% probability of exceedance in 50 years based on pga (maximum median value about 1.5 m/s²), and 0.4 s (>2 m/s²) and 1.0 s (about 0.8 m/s²) pseudoaccelerations, all for rock. The pga 84% fractile map also has a maximum value above 2 m/s² and shows similarities with the median map for 0.4 s. In all maps, the maximum values fall within the area 6.2–6.3°E and 50.8–50.9°N, i.e., east of Aachen.
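The 10% and 2% probabilities of exceedance in 50 years quoted above map onto the conventional return periods under the usual Poisson assumption; a minimal sketch of that standard conversion (not specific to this study):

```python
import math

# Return period implied by a probability of exceedance p over an exposure time t,
# under the Poisson (memoryless) assumption common in probabilistic hazard work.

def return_period(p: float, t_years: float = 50.0) -> float:
    return -t_years / math.log(1.0 - p)

for p in (0.10, 0.02):
    print(f"{p:.0%} in 50 yr -> return period ~{return_period(p):.0f} yr")
```

This gives roughly 475 and 2475 years, the return periods conventionally associated with these two hazard levels.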
108.
Histograms of observations from spatial phenomena are often found to be more heavy-tailed than Gaussian distributions, which makes the Gaussian random field model unsuitable. A T-distributed random field model with heavy-tailed marginal probability density functions is defined. The model is a generalization of the familiar Student-T distribution, and it may be given a Bayesian interpretation. The increased variability appears across realizations rather than within realizations, since each realization is Gaussian-like, with the variance varying from realization to realization. The T-distributed random field model is analytically tractable, and the conditional model is developed, which provides algorithms for conditional simulation and prediction, so-called T-kriging. The model compares favourably with most previously defined random field models. The Gaussian random field model appears as a special, limiting case of the T-distributed random field model. The model is particularly useful whenever multiple, sparsely sampled realizations of the random field are available, and it is clearly preferable to the Gaussian model in this case. The properties of the T-distributed random field model are demonstrated on well-log observations from the Gullfaks field in the North Sea. The predictions correspond to traditional kriging predictions, while the associated prediction variances are more representative, as they are layer-specific and include the uncertainty caused by using variance estimates.
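One way to picture "Gaussian-like realizations with varying variance between realizations" is the standard scale-mixture construction of a Student-t field. The sketch below is an illustrative construction under that reading (with an assumed exponential covariance and ν = 4), not the paper's algorithm:

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): a Student-t random field built
# as a scale mixture of Gaussians. Each realization is a Gaussian field multiplied
# by one random scale per realization, so realizations look Gaussian individually
# while the ensemble has heavy-tailed (t) marginals.

rng = np.random.default_rng(42)

def t_field_realization(coords: np.ndarray, nu: float, corr_range: float) -> np.ndarray:
    d = np.abs(coords[:, None] - coords[None, :])
    cov = np.exp(-d / corr_range)                 # exponential covariance (assumed)
    gauss = rng.multivariate_normal(np.zeros(coords.size), cov)
    scale = nu / rng.chisquare(nu)                # inverse-gamma mixing variable
    return np.sqrt(scale) * gauss

x = np.linspace(0.0, 10.0, 200)
realizations = np.array([t_field_realization(x, nu=4.0, corr_range=1.5) for _ in range(300)])
print("std per realization (first 5):", np.round(realizations.std(axis=1)[:5], 2))
```

The printed standard deviations differ markedly from realization to realization, which is exactly the cross-realization (rather than within-realization) extra variability described in the abstract.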
109.
This study investigates the extent to which people's views on the causes and preventability of earthquake damage might be influenced by their degree of exposure to the hazard, as well as by the information they have been given about it. The results show that the provision of hazard-zoning information influences judgements on the preventability and causes of damage, but this effect depends on the degree of hazard faced by residents. In low-hazard zones, information leads to the view that causes are manageable, whereas in high-hazard zones information may induce a degree of fatalism. The use of public information in risk management therefore needs to take into account the degree of risk faced by the recipients.