981.
In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013, and the event detection algorithm was upgraded in September 2017. All data back to 2014 were reprocessed using the new algorithm. To evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. To overcome the problem of rare events, which biases these measures, we introduce a new event-based method: the timeline of the H\(\upalpha\) observations is divided into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The new algorithm reached a hit rate of 96% (only five events were missed) and a false-alarm ratio of 17%, a significant improvement over the original system, which had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm, up from 0.5 originally. The mean flare positions are accurate to within \({\pm}\,1\) heliographic degree for both algorithms, and the peak times improve from a mean difference of \(1.7\pm 2.9~\mbox{minutes}\) to \(1.3\pm 2.3~\mbox{minutes}\). The flare start times, which the original algorithm had determined systematically late by about 3 minutes, now match visual inspection to within \(-0.47\pm 4.10~\mbox{minutes}\).
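The contingency-table verification measures quoted in this abstract (hit rate, false-alarm ratio, true skill score, Heidke skill score) follow standard forecast-verification definitions. A minimal sketch, with illustrative counts rather than the study's actual 2 × 2 table:

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table measures used in forecast verification."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                  # hit rate (probability of detection)
    far = false_alarms / (hits + false_alarms)    # false-alarm ratio
    pofd = false_alarms / (false_alarms + correct_negatives)
    tss = pod - pofd                              # true (Hanssen-Kuipers) skill score
    # Heidke skill score: fraction of correct classifications beyond random chance
    expected = ((hits + misses) * (hits + false_alarms)
                + (misses + correct_negatives) * (false_alarms + correct_negatives)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return pod, far, tss, hss

# Illustrative counts only (not the KSO event tallies)
pod, far, tss, hss = verification_scores(hits=95, misses=5,
                                         false_alarms=20, correct_negatives=100)
```

Note that with rare events the correct-negative count dominates, inflating chance-corrected scores; the event-based segmentation described above is one way to keep the positive and negative classes balanced.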
982.
During the last two decades, the first generation of beam combiners at the Very Large Telescope Interferometer (VLTI) has proved the importance of optical interferometry for high-angular-resolution astrophysical studies in the near- and mid-infrared. With the advent of 4-beam combiners at the VLTI, the u-v coverage per pointing increases significantly, providing an opportunity to use reconstructed images as powerful scientific tools. Interferometric imaging is therefore already a key feature of the new generation of VLTI instruments, as well as of other interferometric facilities like CHARA and JWST. It is thus imperative to take stock of current image reconstruction capabilities and their expected evolution in the coming years. Here, we present a general overview of the current state of optical interferometric image reconstruction, with a focus on new wavelength-dependent information, highlighting its main advantages and limitations. As an appendix, we include several cookbooks describing the installation and usage of several state-of-the-art image reconstruction packages. To illustrate the current capabilities of the software available to the community, we recovered chromatic images from simulated MATISSE data using the MCMC software SQUEEZE. With these images, we aim to show the importance of selecting good regularization functions and their impact on the reconstruction.
983.
The Taihang–Wutai mountain region, along the border of Hebei and Shanxi provinces in northern China, exposes well-developed Early Precambrian metamorphic rock series over an area of some 20,000 square kilometers and forms an important part of the North China Block. The region preserves records of various geological processes from the different stages of Early Precambrian development. In-depth study of its geological history is therefore of typical significance for elucidating the formation and evolution of the North China Block. With this aim, the authors carried out isotope geochronological studies in the region.
984.
Many studies on invasive species show reduced native densities, but few studies measure trait-mediated effects as mechanisms for changes in native growth rates and population dynamics. Where native prey face invasive predators, mechanisms for phenotypic change include selective predation, or induced behavioral or morphological plasticity. Invasive green crabs, Carcinus maenas, have contributed to declines in native soft-shell clams, Mya arenaria, in coastal New England, USA. We tested the hypothesis that clam ability to detect chemical cues from predators or damaged conspecifics would induce greater burrowing depth as a refuge from invasive crabs, and that greater burrowing would require increased siphon growth. To determine how crab predation affected clam survivorship and phenotypic traits in the field, clams in exclosure, open, and crab enclosure plots were compared. Crab predation reduced clam density, and surviving clams were deeper and larger, with longer siphons. To determine whether the mechanism for these results was selective predation or induced plasticity, phenotypes were compared between clams exposed to chemical cues from crab predation and clams exposed to seawater in laboratory and field experiments. In response to crab predation cues, clams burrowed deeper, with longer siphons and greater siphon mass. Overall, crab predation removed clams with shorter siphons at shallow depths, and crab predation cues induced greater burrowing depths and longer siphons. The longer siphons and greater siphon mass of deeper clams suggest that clams may allocate energy to siphon growth in response to crabs. By determining native behavioral and morphological changes in response to an invasive predator, this study adds to our understanding of mechanisms for invasive impacts and illustrates the utility of measuring trait-mediated effects to investigate predator–prey dynamics.
985.
Quadrature-based approach for the efficient evaluation of surge hazard   (cited 3 times: 0 self-citations, 3 by others)
The Joint Probability Method (JPM) has been used for hurricane surge frequency analysis for over three decades, and remains the method of choice given the limitations of more direct historical methods. However, use of the JPM approach in conjunction with the modern generation of complex high-resolution numerical models (used to describe winds, waves, and surge) has become highly inefficient, owing to the large number of costly storm simulations that are typically required. This paper describes a new approach to the selection of the storm simulation set that permits reduction of the JPM computational effort by about an order of magnitude (compared to a more conventional approach) while maintaining good accuracy. The method uses an integration scheme called Bayesian or Gaussian-process quadrature (together with conventional integration methods) to evaluate the multi-dimensional joint probability integral over the space of storm parameters (pressure, radius, speed, heading, and any others found to be important) as a weighted summation over a relatively small set of optimally selected nodes (synthetic storms). Examples of an application of the method are shown, drawn from the recent post-Katrina study of coastal Mississippi.
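The core idea, evaluating a probability-weighted integral as a weighted sum over a few optimally placed nodes, can be illustrated with ordinary Gauss-Hermite quadrature on a single Gaussian-distributed storm parameter. The paper's Bayesian quadrature generalizes this to several jointly distributed parameters; the surge function and distribution below are hypothetical stand-ins:

```python
import numpy as np

def surge_response(x):
    """Hypothetical surge (m) as a function of central-pressure deficit (hPa),
    standing in for a costly wind/wave/surge model run."""
    return 2.0 + 0.05 * x + 0.001 * x**2

# Probabilists' Gauss-Hermite nodes/weights: exact for polynomials up to
# degree 2n-1 under a standard normal weight function.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)

mu, sigma = 50.0, 10.0   # assumed parameter distribution N(mu, sigma^2)
# E[f(X)] as a weighted sum over only five "synthetic storms"
expected_surge = np.sum(weights * surge_response(mu + sigma * nodes)) / np.sqrt(2 * np.pi)
```

For this quadratic response the five-node rule is exact; in the JPM setting the payoff is that each node is one expensive numerical simulation, so fewer, well-placed nodes translate directly into reduced computational effort.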
986.
We analyze errors in the global bathymetry models of Smith and Sandwell that combine satellite altimetry with acoustic soundings and shorelines to estimate depths. Versions of these models have been incorporated into Google Earth and the General Bathymetric Chart of the Oceans (GEBCO). We use Japan Agency for Marine-Earth Science and Technology (JAMSTEC) multibeam surveys not previously incorporated into the models as “ground truth” to compare against model versions 7.2 through 12.1, defining vertical differences as “errors.” Overall error statistics improve over time: 50th percentile errors declined from 57 to 55 to 49 m, and 90th percentile errors declined from 257 to 235 to 219 m, in versions 8.2, 11.1 and 12.1. This improvement is partly due to an increasing number of soundings incorporated into successive models, and partly to improvements in the satellite gravity model. Inspection of specific sites reveals that changes in the algorithms used to interpolate across survey gaps with altimetry have affected some errors. Versions 9.1 through 11.1 show a bias in the scaling from gravity in milliGals to topography in meters that affected the 15–160 km wavelength band. Regionally averaged (>160 km wavelength) depths have accumulated error over successive versions 9 through 11. These problems have been mitigated in version 12.1, which shows no systematic variation of errors with depth. Even so, version 12.1 is in some respects not as good as version 8.2, which employed a different algorithm.
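The percentile error statistics quoted above are straightforward to compute for any model/survey pair. A minimal sketch, using synthetic depths in place of the actual gridded model and JAMSTEC multibeam data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical co-located depths (m): model estimate vs. multibeam "ground truth"
model_depth = rng.uniform(200.0, 6000.0, size=100_000)
survey_depth = model_depth + rng.normal(0.0, 120.0, size=100_000)

# Vertical difference, defined as "error"; percentiles summarize its distribution
errors = np.abs(model_depth - survey_depth)
p50, p90 = np.percentile(errors, [50, 90])
```

Tracking p50 and p90 across model versions, as the study does for versions 8.2, 11.1, and 12.1, gives a version-over-version accuracy trend that is robust to a few extreme outliers.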
987.
A promising method for gas hydrate exploration incorporates pre-stack seismic inversion data, elastic properties modeling, and seismic interpretation to predict the saturation of gas hydrates (Sgh). The technology can be modified slightly and used for predicting hydrate concentrations in shallow arctic locations as well. Examples from the Gulf of Mexico Walker Ridge (WR) and Green Canyon (GC) protraction areas illustrate how Sgh was derived and used to support the selection of well locations to be drilled for gas hydrates in sand reservoirs by the Chevron-led Joint Industry Project (JIP) Leg II cruise in 2009. Concentrations of hydrates were estimated through the integration of seismic inversion of carefully conditioned pre-stack data, seismic stratigraphic interpretation, and shallow rock property modeling. Rock property trends were established by applying principles of rock physics and shallow sediment compaction, constrained by regional geological knowledge. No nearby sonic or density logs were available to define the elastic property trends in the zone of interest. Sgh volumes were generated by inverting pre-stack data to acoustic and shear impedance (PI and SI) volumes, and then analyzing deviations from modeled impedance trends. To enhance the quality of the inversion, we stress the importance of maximizing the signal-to-noise ratio of the offset data by conditioning seismic angle gathers prior to inversion. Seismic interpretation further plays an important role by identifying false anomalies, such as hard, compact strata, which can produce apparent high Sgh values, and by identifying the more promising strata and structures for containing the hydrates. This integrated workflow presents a highly promising methodology, appropriate for the exploration of gas hydrates.
988.
Submarine groundwater discharge (SGD) assessments were conducted both in the laboratory and at a field site in the northeastern Gulf of Mexico, using a continuous heat-type automated seepage meter (seepmeter). The functioning of the seepmeter is based on measurements of a temperature gradient in the water between downstream and upstream positions in its flow pipe. The device has the potential to provide long-term, high-resolution measurements of SGD. Using a simple, inexpensive laboratory set-up, we have shown that connecting an extension cable to the seepmeter has a negligible effect on its measuring capability. Similarly, the observed influence of very low temperature (≤3 °C) on seepmeter measurements can be accounted for by conducting calibrations at such temperatures prior to field deployments. Compared to manual volumetric measurements, calibration experiments showed that at higher water flow rates (>28 cm day−1 or cm3 cm−2 day−1) an analog flowmeter overestimated flow rates by ≥7%, apparently due to flow resistance, turbulence, and the formation of air bubbles in the seepmeter water flow tubes. Salinity had no significant effect on the performance of the seepmeter: calibration results from fresh water and sea water agreed closely at the 95% confidence level (R2 = 0.98). Overall, the seepmeter provided SGD data comparable to manually operated seepage meters, the radon geochemical tracer approach, and an electromagnetic (EM) seepage meter.
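A calibration comparison like the fresh-water/sea-water agreement quoted above (R2 = 0.98) reduces to a least-squares fit between paired flow measurements. A minimal sketch, with hypothetical paired flow rates standing in for the study's calibration data:

```python
import numpy as np

# Hypothetical calibration pairs: manual volumetric flow vs. seepmeter reading (cm/day)
manual = np.array([5.0, 10.0, 15.0, 20.0, 28.0, 35.0, 45.0])
seep = np.array([5.2, 9.7, 15.4, 19.8, 28.5, 34.1, 44.6])

# Least-squares calibration line and coefficient of determination
slope, intercept = np.polyfit(manual, seep, 1)
predicted = slope * manual + intercept
ss_res = np.sum((seep - predicted) ** 2)
ss_tot = np.sum((seep - seep.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A slope near 1 with high R2 indicates the seepmeter tracks the manual reference across the flow range; a systematic departure at high flows (like the ≥7% overestimate noted above) would show up as curvature in the residuals rather than in R2 alone.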
989.
This study examines the distribution of leachable particulate iron (Fe) in the Columbia River, estuary, and near-field plume. Surface samples were collected during late spring and summer of 2004–2006 as part of four River Influence on Shelf Ecosystems (RISE) cruises. Tidal amplitude and river flow are the primary factors influencing the estuary leachable particulate Fe concentrations, with greater values during high flow and/or spring tides. Near the mouth of the estuary, leachable particulate Fe [defined as the particulate Fe solubilized with a 25% acetic acid (pH 2) leach containing a weak reducing agent to reduce Fe oxyhydroxides, and a short heating step to access intracellular Fe] averaged 770 nM during either spring tide or high flow, compared to 320 nM during neap-tide, low-flow conditions. In the near-field Columbia River plume, elevated leachable particulate Fe concentrations occur during spring tides and/or higher river flow, with resuspended shelf sediment as an additional source to the plume during periods of coastal upwelling and spring tides. Near-field plume concentrations of leachable particulate Fe (at a salinity of 20) averaged 660 nM during either spring tide or high flow, compared to 300 nM during neap-tide, low-flow conditions. Regardless of tidal amplitude and river flow, leachable particulate Fe concentrations in both the river/estuary and near-field plume are consistently one to two orders of magnitude greater than dissolved Fe concentrations. The Columbia River is an important source of reactive Fe to the productive coastal waters off Oregon and Washington, and leachable particulate Fe is available for solubilization following biological drawdown of the dissolved phase. Elevated leachable Fe concentrations allow coastal waters influenced by the Columbia River plume to remain Fe-replete and support phytoplankton production during the spring and summer seasons.
990.
In September 2008, Hurricanes Gustav and Ike generated major storm surges which impacted the Lake Pontchartrain estuary in Louisiana. This paper presents analyses of in situ measurements acquired during these storm events. The main data used in the analyses were from three bottom-mounted moorings equipped with conductivity, temperature, and depth sensors and acoustic Doppler current profilers (ADCPs), and a semi-permanent laterally mounted horizontal acoustic Doppler profiler (ADP). These moorings were deployed in the three major tidal channels that connect Lake Pontchartrain with the coastal ocean. A process similar to tidal straining was observed: the vertical shear of the horizontal velocity was negligible during the inundation stage, but a shear of 0.8 m/s over a less than 5 m water column was recorded during the receding stage, 2–3 times that of the normal tidal oscillations. The surge reached its peak in the Industrial Canal 1.4–2.1 h before those in the other two channels. The inward flux of water lasted for a shorter time period than the outward flux, and was also much smaller in magnitude (∼960–1200 vs. 2100–3100 million m3). The imbalance was believed to have been caused by additional water entering Lake Pontchartrain through some small rivers and by inundation over the land plus rainfall from the hurricanes. The flux through the Industrial Canal was 8–12% of the total; the other two tidal passes carried the remaining ∼88–92%, each ranging between 17% and 70% of the total but mostly splitting the remainder roughly in half.