172 results found (search time: 45 ms)
2.
In this, the third in a series of three papers concerning the SuperCOSMOS Sky Survey, we describe the astrometric properties of the database. We describe the algorithms employed in the derivation of the astrometric parameters of the data, and demonstrate their accuracies by comparison with external data sets using the first release of data, the South Galactic Cap survey. We show that the celestial coordinates, which are tied to the International Celestial Reference Frame via the Tycho-2 reference catalogue, are accurate to better than ±0.2 arcsec at J, R ∼ 19, 18, rising to ±0.3 arcsec at J, R ∼ 22, 21, with position-dependent systematic effects from bright to faint magnitudes at the ∼0.1-arcsec level. The proper motion measurements are shown to be accurate to typically ±10 mas yr−1 at J, R ∼ 19, 18, rising to ±50 mas yr−1 at J, R ∼ 22, 21, and are tied to zero using the extragalactic reference frame. We show that the zero-point errors in the proper motions are ≤1 mas yr−1 for R > 17, and are no larger than ∼10 mas yr−1 for R < 17.
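The idea of tying proper motions to zero via an extragalactic frame can be illustrated with a toy sketch: distant galaxies have essentially zero proper motion, so the median of their measured motions estimates the local zero-point error, which is then subtracted from the stellar measurements. This is a minimal illustration, not the survey's actual pipeline; all numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated measured proper motions (mas/yr): galaxies share a common
# plate-dependent zero-point error; their true motion is zero.
zero_point_error = np.array([3.0, -2.5])               # hypothetical systematic
galaxy_pm = zero_point_error + rng.normal(0.0, 10.0, size=(500, 2))
star_pm = np.array([[12.0, -7.0]]) + zero_point_error  # one star's measured PM

# Tie the frame to zero: the median galaxy PM estimates the zero-point error.
correction = np.median(galaxy_pm, axis=0)
star_pm_corrected = star_pm - correction

print(correction)         # close to the injected zero-point error
print(star_pm_corrected)  # close to the star's true PM of (12, -7) mas/yr
```

With 500 galaxies and a 10 mas yr−1 measurement scatter, the median pins the zero point to well under 1 mas yr−1, consistent with the sub-mas yr−1 zero-point errors quoted above.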
5.
We apply the ztrace algorithm to the optical NOG and infrared PSCz galaxy catalogues to reconstruct the pattern of primordial fluctuations that have generated our local Universe. We check that the density fields traced by the two catalogues are well correlated, and consistent with a linear relation [either in δ or in log(1 + δ)] with relative bias (of NOG with respect to PSCz) b_rel = 1.1 ± 0.1. The relative bias relation is used to fill the optical zone of avoidance at |b| < 20° using the PSCz galaxy density field.
We perform extensive testing on simulated galaxy catalogues to optimize the reconstruction. The quality of the reconstruction is predicted to be good at large scales, up to a limiting wavenumber k_lim ≃ 0.4 h Mpc−1, beyond which all information is lost. We find that the improvement arising from the denser sampling of the optical catalogue is compensated by the uncertainties connected to the larger zone of avoidance.
The initial conditions reconstructed from the NOG catalogue are found (analogously to those from the PSCz) to be consistent with a Gaussian paradigm. We use the reconstructions to produce sets of initial conditions ready to be used for constrained simulations of our local Universe.
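A linear relative bias of the kind quoted above can be estimated by a zero-intercept least-squares fit of one smoothed overdensity field against the other; the fitted relation then predicts the optical field where it is unobserved. This is a schematic sketch on synthetic fields, not the paper's procedure; the field values and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy smoothed overdensity fields sampled on a common grid of cells: the
# "optical" field is a biased copy of the "infrared" one plus scatter.
delta_pscz = rng.normal(0.0, 0.3, size=2000)
b_rel_true = 1.1                                 # illustrative value
delta_nog = b_rel_true * delta_pscz + rng.normal(0.0, 0.05, size=2000)

# Linear bias estimate for delta_NOG = b_rel * delta_PSCz via least squares
# (zero intercept: both fields have zero mean by construction).
b_rel = np.sum(delta_nog * delta_pscz) / np.sum(delta_pscz**2)
print(round(b_rel, 2))

# The fitted relation can then fill the unobserved (zone-of-avoidance) cells:
delta_nog_predicted = b_rel * delta_pscz
```

The same fit could equally be done in log(1 + δ), as the abstract notes; the two choices differ only for strongly nonlinear overdensities.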
8.
The new procedure of earthquake hazard evaluation developed by Kijko and Sellevoll is tested and applied for the border region of Czechoslovakia and Poland. The new method differs from the conventional approach: it incorporates the uncertainty of earthquake magnitudes, and accepts mixed data containing only large historical events alongside recent, complete catalogues. Seismic hazard has been calculated for nine regions determined in the border area. In the investigated area, data of historical catalogues are uncertain or, in many cases, the epicentral intensities are unknown. Thus, a number of assumptions had to be adopted in preparing the catalogue data since the year 1200. The calculated values of the parameter b in the Gutenberg-Richter frequency-intensity relation, as well as the return periods, seem reasonable and are generally confirmed by the results obtained from catalogues for the last 80–130 years.
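For a complete catalogue, the Gutenberg-Richter b-value and return periods mentioned above can be estimated with the classical Aki maximum-likelihood formula. The sketch below uses a synthetic magnitude catalogue, not the Czechoslovak-Polish data, and the completeness level, time span and b-value are illustrative assumptions (it also does not include the historical/complete mixing that distinguishes the Kijko-Sellevoll method).

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic complete catalogue: Gutenberg-Richter magnitudes are
# exponentially distributed above the completeness magnitude Mc.
mc = 3.5
b_true = 1.0                                 # illustrative value
beta = b_true * np.log(10.0)
mags = mc + rng.exponential(1.0 / beta, size=5000)

# Aki (1965) maximum-likelihood b-value for continuous magnitudes.
b_est = np.log10(np.e) / (mags.mean() - mc)
print(round(b_est, 2))

# Return period of events with M >= m, from log10 N(M>=m) = a - b*m,
# with `a` fixed so that N(>=mc) matches the observed count over `years`.
years = 100.0
a = np.log10(len(mags)) + b_est * mc

def return_period(m):
    n_per_year = 10.0 ** (a - b_est * m) / years
    return 1.0 / n_per_year

print(round(return_period(5.5), 1))          # roughly 2 years here
```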
9.
Numerical models are starting to be used for determining the future behaviour of seismic faults and fault networks. Their ultimate goal is to forecast future large earthquakes. In order to use them for this task, it is necessary to synchronize each model with the current status of the actual fault or fault network it simulates (just as, for example, meteorologists synchronize their models with the atmosphere by incorporating current atmospheric data into them). However, lithospheric dynamics is largely unobservable: important parameters cannot (or can rarely) be measured in nature. Earthquakes, though, provide indirect but measurable clues to the stress and strain status in the lithosphere, which should be helpful for the synchronization of the models. The rupture area is one of the measurable parameters of earthquakes. Here we explore how it can be used to at least synchronize fault models between themselves and forecast synthetic earthquakes. Our purpose is to forecast synthetic earthquakes in a simple but stochastic (random) fault model. By imposing the rupture areas of the synthetic earthquakes of this model on other models, the latter become partially synchronized with the first one. We use these partially synchronized models to successfully forecast most of the largest earthquakes generated by the first model. This forecasting strategy outperforms others that only take into account the earthquake series. Our results suggest that a good way to synchronize more detailed models with real faults is probably to force them to reproduce the sequence of previous earthquake ruptures on the faults. This hypothesis could be tested in the future with more detailed models and actual seismic data.
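The synchronization-by-imposed-ruptures idea can be illustrated with a deliberately minimal toy: a single fault cell that loads at a constant rate and fails when its stress reaches a threshold, with a random stress drop standing in for the rupture area. Forcing a replica model to reproduce the master's sequence of stress drops drives the replica's stress state onto the master's. This is not the paper's model; the loading rate, threshold and drop distribution are all invented, and in this toy the synchronization is complete (the only randomness is in the drops), whereas the paper's models synchronize only partially.

```python
import random

def run_fault(steps, rng, imposed_drops=None):
    """Toy stochastic fault: stress loads at a constant rate; at failure an
    earthquake relieves a random fraction of the stress ("rupture area").
    If imposed_drops is given, the model is forced to reproduce that
    sequence of stress drops (synchronization). Returns (time, drop) events."""
    stress, events = 0.0, []
    for t in range(steps):
        stress += 0.01                        # constant tectonic loading
        if stress >= 1.0:                     # failure threshold
            if imposed_drops is not None and len(events) < len(imposed_drops):
                drop = imposed_drops[len(events)]
            else:
                drop = rng.uniform(0.2, 1.0)  # random stress drop
            stress -= drop
            events.append((t, drop))
    return events

master = random.Random(3)
events = run_fault(20000, master)
drops = [d for _, d in events]

# A replica with a different random stream, forced to reproduce the master's
# rupture sizes, ends up in the same stress state: its events match exactly.
replica = random.Random(99)
replica_events = run_fault(20000, replica, imposed_drops=drops)
print(replica_events == events)               # True: fully synchronized here
```

With only a partial drop history imposed, the replica would track the master for a while and then drift, which is the regime the forecasting experiment above exploits.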
10.
Data from 25 local catalogues and 30 special studies of earthquakes in central, northern and northwestern Europe have been incorporated into a Databank. The data processing includes discriminating event types, eliminating fake events and duplets, and converting different magnitudes and intensities to Mw if this is not given by the original source. The magnitude conversion is a key task of the study and implies the establishment of regression equations where no local relations exist. The Catalogue contains tectonic events from the Databank within the area 44°N–72°N, 25°W–32°E and the time period 1300–1993. The lower magnitude level for the Catalogue entries is set at Mw = 3.50. The areas covered by the different catalogues are associated with polygons. Within each polygon, only data from one or a small number of the local catalogues, supplemented by data from special studies, enter the Catalogue. If two or more such catalogues or studies provide a solution for an event, a priority algorithm selects one entry for the Catalogue. Then Mw is calculated from one of the magnitude types, or from macroseismic data, given by the selected entry according to another priority scheme. The origin time, location, Mw magnitude and reference are specified for each entry of the Catalogue, as is the epicentral intensity, I0, if provided by the original source. Following these criteria, a total of about 5,000 earthquakes constitute the Catalogue. Although originally derived for the purpose of seismic hazard calculation within GSHAP, the Catalogue provides a database for many types of seismicity and seismic hazard studies.
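The two priority steps described above — selecting one entry per event from overlapping sources, then deriving Mw from the best available magnitude type — can be sketched as follows. The catalogue names, priority orders and regression coefficients are all hypothetical placeholders, not the study's actual values.

```python
# Hypothetical priority orders: earlier in the list outranks later.
CATALOGUE_PRIORITY = ["special_study", "local_A", "local_B"]
MAGNITUDE_PRIORITY = ["Mw", "Ms", "ML", "I0"]

# Illustrative linear regressions to Mw, as (slope, intercept) per type.
TO_MW = {"Ms": (0.97, 0.25), "ML": (0.90, 0.50), "I0": (0.55, 1.20)}

def select_entry(solutions):
    """From several solutions for one event, keep the highest-priority one."""
    return min(solutions, key=lambda s: CATALOGUE_PRIORITY.index(s["source"]))

def derive_mw(entry):
    """Convert the highest-priority available magnitude (or I0) to Mw."""
    for mtype in MAGNITUDE_PRIORITY:
        if mtype in entry:
            if mtype == "Mw":
                return entry["Mw"]          # already moment magnitude
            slope, intercept = TO_MW[mtype]
            return slope * entry[mtype] + intercept
    raise ValueError("no usable magnitude or intensity")

# One event reported by two overlapping local catalogues:
event = [
    {"source": "local_B", "ML": 4.1},
    {"source": "local_A", "Ms": 4.0},
]
chosen = select_entry(event)   # local_A outranks local_B
mw = derive_mw(chosen)         # 0.97 * 4.0 + 0.25 = 4.13
print(chosen["source"], round(mw, 2))
```

An entry would enter the final Catalogue only if the derived Mw met the 3.50 threshold quoted above.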