Full-text access type | Articles |
Paid full text | 11774 |
Free | 2265 |
Free (domestic) | 2680 |
Subject classification | Articles |
Surveying and mapping | 1257 |
Atmospheric sciences | 2088 |
Geophysics | 4818 |
Geology | 5411 |
Oceanography | 1097 |
Astronomy | 289 |
General | 922 |
Physical geography | 837 |
Publication year | Articles |
2024 | 78 |
2023 | 252 |
2022 | 341 |
2021 | 523 |
2020 | 508 |
2019 | 558 |
2018 | 411 |
2017 | 488 |
2016 | 561 |
2015 | 552 |
2014 | 645 |
2013 | 728 |
2012 | 708 |
2011 | 665 |
2010 | 602 |
2009 | 634 |
2008 | 682 |
2007 | 725 |
2006 | 717 |
2005 | 660 |
2004 | 551 |
2003 | 539 |
2002 | 500 |
2001 | 446 |
2000 | 421 |
1999 | 473 |
1998 | 464 |
1997 | 389 |
1996 | 375 |
1995 | 380 |
1994 | 288 |
1993 | 211 |
1992 | 148 |
1991 | 124 |
1990 | 103 |
1989 | 62 |
1988 | 66 |
1987 | 37 |
1986 | 15 |
1985 | 14 |
1984 | 21 |
1983 | 8 |
1982 | 9 |
1981 | 2 |
1980 | 3 |
1979 | 10 |
1978 | 3 |
1977 | 6 |
1976 | 5 |
1954 | 6 |
10,000 results found (search time: 10 ms)
51.
To meet the needs of real-time geosteering with logging/mud-logging-while-drilling data and of evaluation plotting for highly deviated and horizontal wells, and to address the drawbacks of traditional drawing methods, a real-time plotting method is proposed that decomposes the logging/mud-logging information, the borehole trajectory, and the geological model into two dimensions. To address the high time and space complexity of this 2D-decomposition plotting, local real-time computation and copy-screen redraw algorithms driven by different events are given; these limit CPU and memory consumption, improve drawing efficiency, and eliminate flicker and stutter in real-time plotting. Application examples show that the 2D-decomposition real-time plotting method can render large-scale while-drilling geosteering graphics smoothly and without stutter in real time, improving the precision and timeliness of reservoir-model evaluation for highly deviated and horizontal wells.
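The "local computation plus copy-screen redraw" idea summarized in this abstract can be illustrated with a hypothetical sketch: keep a cached off-screen buffer and, when new data arrive, rasterize only the newly affected columns instead of repainting the whole track. The buffer layout, column cursor, and per-column update are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

WIDTH, HEIGHT = 1000, 200  # track raster size in pixels

def full_redraw(samples):
    """Rasterize the whole track from scratch (baseline, O(width) per frame)."""
    buf = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
    for x, value in enumerate(samples[:WIDTH]):
        y = min(HEIGHT - 1, int(value))
        buf[y, x] = 255  # one pixel per depth sample
    return buf

class IncrementalTrack:
    """Cached off-screen buffer; only newly arrived columns are drawn."""

    def __init__(self):
        self.buf = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
        self.cursor = 0  # first column not yet drawn

    def append(self, new_samples):
        """Draw only the new columns; a real viewer would then blit just
        the span [old_cursor, cursor) to the screen, avoiding a full repaint."""
        for value in new_samples:
            if self.cursor >= WIDTH:
                break
            y = min(HEIGHT - 1, int(value))
            self.buf[y, self.cursor] = 255
            self.cursor += 1
        return self.buf
```

The incremental result is pixel-identical to a full redraw, but each event only touches the columns it added, which is what keeps CPU cost bounded as the plot grows.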
52.
The gas hazard level of a mine is related to the occurrence and emission modes of gas in the coal seam and depends on multiple geological conditions and mining techniques; among these, the characteristics of the coal itself are particularly important. This paper analyzes five types of mine gas hazard in Hunan Province and the corresponding coal characteristics, and proposes the concept of a "coal characteristic index (I_c)". I_c is a composite indicator for evaluating the gas hazard level of a mine. The study shows that the more severe the gas hazard, the higher the I_c value. The index was applied to predict the gas hazard types of 16 mines, with satisfactory results.
53.
Krzysztof Goździewski, Andrzej J. Maciejewski. Celestial Mechanics and Dynamical Astronomy, 1990, 49(1): 1–10
A software system for normalization of a Hamiltonian function is described. A few examples of its applications are given. It is written in PASCAL and runs on an IBM XT/AT with 640 KB memory.
54.
Two different goals in fitting straight lines to data are to estimate a true linear relation (physical law) and to predict values of the dependent variable with the smallest possible error. Regarding the first goal, a Monte Carlo study indicated that the structural-analysis (SA) method of fitting straight lines to data is superior to the ordinary least-squares (OLS) method for estimating true straight-line relations. The number of data points, the slope and intercept of the true relation, and the variances of the errors associated with the independent (X) and dependent (Y) variables influence the degree of agreement. For example, differences between the two line-fitting methods decrease as error in X becomes small relative to error in Y. Regarding the second goal, predicting the dependent variable, OLS is better than SA. Again, the difference diminishes as X takes on less error relative to Y. With respect to estimation of slope and intercept and prediction of Y, agreement between Monte Carlo results and large-sample theory was very good for sample sizes of 100, and fair to good for sample sizes of 20. The procedures and error measures are illustrated with two geologic examples.
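The comparison the abstract describes can be sketched in a small Monte Carlo: when X carries measurement error, the OLS slope is attenuated toward zero, while an errors-in-variables estimator corrects for it. This sketch uses the standard Deming-regression slope with a known error-variance ratio `lam` as a stand-in for the structural-analysis method; it is an illustration, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

def deming_slope(x, y, lam=1.0):
    """Errors-in-variables (Deming) slope; lam = var(err_y) / var(err_x)."""
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d * d + 4 * lam * sxy ** 2)) / (2 * sxy)

def simulate(n=100, true_slope=2.0, sx=1.0, sy=1.0, trials=500):
    """Mean estimated slope over many trials for both estimators."""
    ols, dem = [], []
    for _ in range(trials):
        xt = rng.normal(0, 2, n)          # true X values
        x = xt + rng.normal(0, sx, n)     # observed X with error
        y = true_slope * xt + rng.normal(0, sy, n)
        ols.append(ols_slope(x, y))
        dem.append(deming_slope(x, y, lam=(sy / sx) ** 2))
    return np.mean(ols), np.mean(dem)
```

With var(true X) = 4 and unit error variance in X, OLS is attenuated by a factor of about 4/5 (a slope near 1.6 instead of 2.0), while the Deming estimate stays near the true slope, matching the abstract's conclusion for the "physical law" goal.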
55.
56.
Increasing critical sensitivity of the Load/Unload Response Ratio before large earthquakes with identified stress accumulation pattern (cited 2 times: 0 self-citations, 2 by others)
Huai-zhong Yu, Zheng-kang Shen, Yong-ge Wan, Qing-yong Zhu, Xiang-chu Yin. Tectonophysics, 2006, 428(1–4): 87–94
The Load/Unload Response Ratio (LURR) method is proposed for short-to-intermediate-term earthquake prediction [Yin, X.C., Chen, X.Z., Song, Z.P., Yin, C., 1995. A New Approach to Earthquake Prediction: The Load/Unload Response Ratio (LURR) Theory. Pure Appl. Geophys., 145, 701–715]. The method measures the ratio between Benioff strains released during periods of loading and unloading, corresponding to the Coulomb failure stress change induced by Earth tides on optimally oriented faults. According to the method, the LURR time series usually climbs to an anomalously high peak prior to the occurrence of a large earthquake. Previous studies have indicated that the size of the critical seismogenic region selected for LURR measurements has a great influence on the evaluation of LURR. In this study, we replace the circular region usually adopted in LURR practice with an area within which the tectonic stress change would most affect the Coulomb stress on a potential seismogenic fault of a future event. The Coulomb stress change before a hypothetical earthquake is calculated from a simple back-slip dislocation model of the event. This new algorithm, combining the LURR method with our choice of identified area of increased Coulomb stress, is devised to improve the sensitivity of LURR in measuring the criticality of stress accumulation before a large earthquake. Retrospective tests of this algorithm on four large earthquakes that occurred in California over the last two decades show remarkable enhancement of the LURR precursory anomalies. For some strong events of lesser magnitude that occurred in the same neighborhoods during the same time periods, significant anomalies are found if circular areas are used for LURR data selection, and are not found if increased-Coulomb-stress areas are used. This feature of the algorithm may provide stronger constraints on forecasts of the size and location of future large events.
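The core LURR quantity can be sketched as follows. This is a minimal illustration that assumes the loading/unloading classification of each event is already given (in the paper it comes from the tidally induced Coulomb failure stress change on optimally oriented faults) and uses the standard Gutenberg–Richter energy relation to form Benioff strain; it is not the authors' full algorithm.

```python
import math

def benioff_strain(magnitude):
    """Square root of radiated seismic energy, using the
    Gutenberg-Richter relation log10(E) = 1.5*M + 4.8 (E in joules)."""
    return math.sqrt(10 ** (1.5 * magnitude + 4.8))

def lurr(events):
    """LURR for a window of events.

    events: iterable of (magnitude, is_loading) pairs, where is_loading
    flags whether the tidally induced Coulomb failure stress change was
    positive (loading phase) at the event's occurrence time.
    """
    loading = sum(benioff_strain(m) for m, load in events if load)
    unloading = sum(benioff_strain(m) for m, load in events if not load)
    return loading / unloading

# Balanced strain release in the two phases gives LURR near 1; a deficit
# of unloading-phase release drives LURR above 1, which is the claimed
# precursory anomaly before large events.
```

In practice the events entering the sum are those inside the selected region, which is exactly why the paper's replacement of the circular region with an increased-Coulomb-stress area changes the sensitivity of the measure.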
57.
Analyzing the tables and probability maps posted by Yan Y. Kagan and David D. Jackson in April 2002–September 2004 at http://scec.ess.ucla.edu/~ykagan/predictions_index.html and the catalog of earthquakes for the same period, the conclusion is drawn that the underlying method could be used for prediction of aftershocks, while it does not outscore random guessing when main shocks are considered.
58.
Álvaro González, Miguel Vázquez-Prada, Javier B. Gómez, Amalio F. Pacheco. Tectonophysics, 2006, 424(3–4): 319
Numerical models are starting to be used for determining the future behaviour of seismic faults and fault networks. Their final goal would be to forecast future large earthquakes. In order to use them for this task, it is necessary to synchronize each model with the current status of the actual fault or fault network it simulates (just as, for example, meteorologists synchronize their models with the atmosphere by incorporating current atmospheric data into them). However, lithospheric dynamics is largely unobservable: important parameters cannot (or can rarely) be measured in nature. Earthquakes, though, provide indirect but measurable clues to the stress and strain status of the lithosphere, which should be helpful for synchronizing the models. The rupture area is one of the measurable parameters of earthquakes. Here we explore how it can be used to at least synchronize fault models with one another and forecast synthetic earthquakes. Our purpose is to forecast synthetic earthquakes in a simple but stochastic (random) fault model. By imposing the rupture areas of the synthetic earthquakes of this model on other models, the latter become partially synchronized with the first one. We use these partially synchronized models to successfully forecast most of the largest earthquakes generated by the first model. This forecasting strategy outperforms others that take into account only the earthquake series. Our results suggest that a good way to synchronize more detailed models with real faults is probably to force them to reproduce the sequence of previous earthquake ruptures on the faults. This hypothesis could be tested in the future with more detailed models and actual seismic data.
59.
P. D. Katsabanis, A. Tawadrous, C. Braun, C. Kennedy. Fragblast: International Journal for Blasting and Fragmentation, 2006, 10(1): 83–93
A series of small-scale tests simulating multi-hole blasts was performed to establish the effect of delays on blast fragmentation. The blasts were performed in high-quality granodiorite blocks cut from stone prepared by dimensional-stone quarry operations. The pattern was equilateral triangular, with a spacing of 10.2 cm between boreholes; the boreholes had a diameter of 11 mm, were loaded with detonating cord, and used water as the coupling medium. Delays of 0 to 100 μs between holes were achieved using different lengths of detonating cord, and larger delays of up to 4 ms used a sequential blasting machine firing seismic detonators. All fragments were collected and screened. The experiments showed that the worst fragmentation resulted from simultaneous initiation of all charges. Fragmentation improved with increasing delay time, up to 1 ms between holes. Scaled up, the results show that in granodiorite, fragmentation optimization requires delays of a few milliseconds per metre of burden. The findings agree with previously published work involving larger-scale experiments and other rock types.
60.
Faisal Hossain. Natural Hazards, 2006, 37(3): 263–276
The three most important components necessary for the functioning of an operational flood warning system are: (1) a rainfall measuring system; (2) a soil moisture updating system; and (3) a surface discharge measuring system. Surface-based networks for these systems are largely inadequate in many parts of the world, and this inadequacy particularly affects the tropics, which are most vulnerable to flooding hazards. Furthermore, the tropical regions comprise developing countries lacking the financial resources for such surface-based monitoring. The heritage of research on evaluating the potential for measuring discharge from space has now morphed into an agenda for a mission dedicated to space-based surface discharge measurements. This mission, juxtaposed with two other upcoming space-based missions, (1) for rainfall measurement (Global Precipitation Measurement, GPM) and (2) for soil moisture measurement (Hydrosphere State, HYDROS), holds promise for designing a fully space-borne system for early warning of floods. Such a system, if operational, stands to offer tremendous socio-economic benefit to many flood-prone developing nations of the tropical world. However, two competing aspects need careful assessment to justify the viability of such a system: (1) cost-effectiveness given surface data scarcity, and (2) flood prediction uncertainty due to uncertainty in the remote sensing measurements. This paper presents the flood hazard mitigation opportunities offered by the assimilation of the three proposed space missions within the context of these two competing aspects. The discussion is cast from the perspective of current understanding of the prediction uncertainties associated with space-based flood prediction. A conceptual framework for a fully space-borne system for early warning of floods is proposed, and the need for retrospective validation of such a system on historical data comprising floods and their associated socio-economic impacts is stressed. This proposal for a fully space-borne system, if pursued through a wide interdisciplinary effort as recommended here, promises to enhance the utility of the three space missions beyond what their individual agendas can be expected to offer.