Similar articles
 20 similar articles matched (search time: 687 ms)
1.
2.
VOEvent packets describe transient astronomical events, so the Simple Time Access Protocol (STAP) specification provides a useful time‐based query mechanism. Once event metadata has been extracted from VOEvent packets into a relational database, an AstroGrid STAP client web service can be configured to query the events. The resulting STAP service is registered with an IVOA‐compliant registry, and users can then query VOEvent archives through virtual observatory data searching applications such as the AstroGrid VOScope application. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
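A STAP query is just an HTTP GET with a time range. The sketch below builds such a request URL; the endpoint and the parameter names (`START`, `END`, `FORMAT`) are illustrative assumptions for this example, not quotations from the STAP specification.

```python
from urllib.parse import urlencode

def stap_query_url(base_url, start_iso, end_iso):
    """Build a STAP-style time query URL.

    The parameter names START/END/FORMAT and the endpoint are
    illustrative assumptions, not the normative STAP parameter list.
    """
    params = {"START": start_iso, "END": end_iso, "FORMAT": "votable"}
    return base_url + "?" + urlencode(params)

# Query a hypothetical VOEvent archive for one day of events
url = stap_query_url("http://example.org/voevent-stap",
                     "2008-01-01T00:00:00", "2008-01-02T00:00:00")
```

The service would answer such a request with a VOTable of matching events, which a client like VOScope can then display.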

3.
A trivial modification to the XML schema of VOEvent v1.1 allows the inclusion of W3C digital signatures. Signatures enable identification, identification enables trust, and trust enables authorization. Such changes would inhibit abuse of the VOEvent networks. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
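Schematically, such a packet would carry an enveloped W3C XML‐Signature block as a child of the root element. The `ds:` element names below follow the XML‐DSig namespace, but the packet layout, namespace URI, and elided values (`...`) are illustrative, not the proposed schema text:

```xml
<voe:VOEvent ivorn="ivo://example.org/events#1234" role="observation"
             version="1.1" xmlns:voe="http://www.ivoa.net/xml/VOEvent/v1.1"
             xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
  <Who>
    <AuthorIVORN>ivo://example.org/observatory</AuthorIVORN>
  </Who>
  <!-- Enveloped W3C signature over the rest of the packet;
       Transforms/DigestMethod elements omitted for brevity -->
  <ds:Signature>
    <ds:SignedInfo>
      <ds:Reference URI="">
        <ds:DigestValue>...</ds:DigestValue>
      </ds:Reference>
    </ds:SignedInfo>
    <ds:SignatureValue>...</ds:SignatureValue>
  </ds:Signature>
</voe:VOEvent>
```

A subscriber verifies the digest and signature before acting on the event, which is what makes the identification-trust-authorization chain possible.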

4.
This article describes the VO registry concept and how it can be used to find resources – data sets, services, and infrastructure – to support follow‐up activities. It also discusses how VOEvent infrastructure components can be incorporated. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

5.
Networks are becoming a key element in most current and all future telescope and observatory projects. The ability to easily and efficiently pass observation data, alert data and instrumentation requests between distributed systems could enable science as never before. However, any effective large scale or meta‐network of astronomical resources will require a common communication format, or development resources will have to be continuously dedicated to creating interpreters. The necessary elements of any astronomy communication can be easily identified, efficiently described and rigidly formatted so that both robotic and human operations can use the same data. In this paper we will explore the current state of notification, identify which notification requirements are essential to create a successful standard, and present a standard now under development by the International Virtual Observatory Alliance (IVOA), called the VOEvent. (© 2006 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
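The "necessary elements" of a VOEvent packet are commonly summarized as Who, What, WhereWhen, Why, and How. The sketch below assembles a packet with that element set using the standard library; the attribute and child-element names are a simplified illustration, not the full IVOA schema.

```python
import xml.etree.ElementTree as ET

def make_voevent(ivorn, author, utc_iso, ra_deg, dec_deg):
    """Minimal VOEvent-style packet with the Who/What/WhereWhen/Why/How
    element set.  Element and attribute names are a simplified sketch,
    not the normative IVOA schema."""
    root = ET.Element("VOEvent", ivorn=ivorn, role="observation",
                      version="1.1")
    who = ET.SubElement(root, "Who")                 # who is reporting
    ET.SubElement(who, "AuthorIVORN").text = author
    what = ET.SubElement(root, "What")               # measured quantities
    ET.SubElement(what, "Param", name="mag", value="18.2")
    wherewhen = ET.SubElement(root, "WhereWhen")     # where and when
    ET.SubElement(wherewhen, "Time").text = utc_iso
    ET.SubElement(wherewhen, "RA").text = str(ra_deg)
    ET.SubElement(wherewhen, "Dec").text = str(dec_deg)
    ET.SubElement(root, "Why", importance="0.9")     # inferred cause
    ET.SubElement(root, "How").text = "robotic telescope, unfiltered CCD"
    return ET.tostring(root, encoding="unicode")

packet = make_voevent("ivo://example.org/events#42",
                      "ivo://example.org/bot",
                      "2006-05-01T12:00:00", 150.1, -22.5)
```

Because the format is rigidly structured, the same packet can be parsed by a robotic telescope scheduler or rendered for a human observer.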

6.
The technique of gravitational microlensing is currently unique in its ability to provide a sample of terrestrial exoplanets around both Galactic disk and bulge stars, allowing their abundance to be measured and their distribution with respect to mass and orbital separation to be determined. This yields valuable information for testing models of planet formation and orbital migration, constituting an important piece in the puzzle for the existence of life forms throughout the Universe. In order to achieve these goals in reasonable time, a well‐coordinated effort involving a network of either 2m or 4×1m telescopes at each site is required. It could lead to the first detection of an Earth‐mass planet outside the Solar system, and even planets less massive than Earth could be discovered. From April 2008, ARTEMiS (Automated Robotic Terrestrial Exoplanet Microlensing Search) is planned to provide a platform for a three‐step strategy of survey, follow‐up, and anomaly monitoring. As an expert system embedded in eSTAR (e‐Science Telescopes for Astronomical Research), ARTEMiS will give advice for follow‐up based on a priority algorithm that selects targets to be observed in order to maximize the expected number of planet detections, and will also alert on deviations from ordinary microlensing light curves by means of the SIGNALMEN anomaly detector. While the use of the VOEvent (Virtual Observatory Event) protocol allows a direct interaction with the telescopes that are part of the HTN (Heterogeneous Telescope Networks) consortium, additional interfaces provide means of communication with all existing microlensing campaigns that rely on human observers. The success of discovering a planet by microlensing critically depends on the availability of a telescope in a suitable location at the right time, which can mean within 10 min. To encourage follow‐up observations, microlensing campaigns are therefore releasing photometric data in real time.
For ongoing planetary anomalies, world‐wide efforts are being undertaken to make sure that sufficient data are obtained, since there is no second chance. Real‐time modelling offers the opportunity of live discovery of extra‐solar planets, thereby providing “Science live to your home”. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
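The core idea of a follow-up priority algorithm can be sketched as a greedy selection: rank ongoing events by expected planet detections per minute of telescope time, then fill the available time. This is a toy illustration of the idea only; the actual ARTEMiS algorithm and its inputs are not described here, and the field names are invented.

```python
def select_targets(events, time_budget_min):
    """Greedy follow-up selection: rank events by expected planet
    detections per minute of telescope time, then pack the night.

    Illustrative only -- ARTEMiS's real priority algorithm is more
    sophisticated; 'p_detect' and 'cost_min' are invented fields."""
    ranked = sorted(events, key=lambda e: e["p_detect"] / e["cost_min"],
                    reverse=True)
    chosen, used = [], 0.0
    for e in ranked:
        if used + e["cost_min"] <= time_budget_min:
            chosen.append(e["name"])
            used += e["cost_min"]
    return chosen
```

With a 40-minute budget, an event offering a small detection probability at very low cost can out-rank a richer but more expensive one, which is exactly the trade-off such an algorithm must arbitrate.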

7.
A time series is a sample of observations of well‐defined data points obtained through repeated measurements over a certain time range. The analysis of such data samples has become increasingly important not only in natural science but also in many other fields of research. Peranso offers a complete set of powerful light curve and period analysis functions to work with large astronomical data sets. Substantial attention has been given to ease‐of‐use and data accuracy, making it one of the most productive time series analysis packages available. In this paper, we give an introduction to Peranso and its functionality. (© 2016 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
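Period analysis of the kind Peranso provides rests on folding the light curve at trial periods and scoring how well the folded curve lines up. A minimal phase-dispersion sketch with NumPy (not Peranso's implementation, which offers many statistics such as ANOVA and Lomb-Scargle):

```python
import numpy as np

def phase_dispersion(t, y, period, nbins=10):
    """Score a trial period: fold the light curve and sum the
    within-bin variances.  The true period gives the smallest score."""
    phase = (t / period) % 1.0
    bins = np.floor(phase * nbins).astype(int)
    s = 0.0
    for b in range(nbins):
        vals = y[bins == b]
        if len(vals) > 1:
            s += np.var(vals) * len(vals)
    return s / len(y)

def best_period(t, y, periods):
    scores = [phase_dispersion(t, y, p) for p in periods]
    return periods[int(np.argmin(scores))]

# irregularly sampled sinusoid with true period 2.5
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 50, 400))
y = np.sin(2 * np.pi * t / 2.5) + 0.1 * rng.normal(size=400)
best = best_period(t, y, np.linspace(2.0, 3.0, 501))
```

The same folding-and-scoring loop underlies most period-search statistics; they differ mainly in how the folded curve is scored.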

8.
This contribution aims to introduce the idea that a well‐evolved HTN of the far future, with the anticipated addition of very large apertures, could also be made to incorporate the ability to carry out photonic astronomy observations, particularly Optical VLBI in a revived Hanbury‐Brown and Twiss Intensity Interferometry (HBTII) configuration. Such an HTN could exploit its inherent rapid reconfigurational ability to become a multi‐aperture distributed photon‐counting network able to study higher‐order spatiotemporal photon correlations and provide a unique tool for direct diagnostics of astrophysical emission processes. We very briefly review various considerations associated with the switching of the HTN to a special mode in which single‐photon detection events are continuously captured for a posteriori intercorrelation. In this context, photon arrival times should be determined to the highest time resolution possible and extremely demanding absolute time keeping and absolute time distribution schemes should be devised and implemented in the HTN nodes involved. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
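The a-posteriori intercorrelation step amounts to binning two stations' photon time stamps and comparing the coincidence rate with the accidental rate expected for uncorrelated streams. A toy zero-lag estimator (a sketch of the principle, not an HBTII pipeline):

```python
import numpy as np

def g2_zero_lag(ta, tb, bin_width, duration):
    """Normalized zero-lag intensity correlation of two photon streams:
    coincidences per time bin divided by the accidental rate that
    uncorrelated streams of the same mean rates would produce."""
    nbins = int(duration / bin_width)
    ca, _ = np.histogram(ta, bins=nbins, range=(0.0, duration))
    cb, _ = np.histogram(tb, bins=nbins, range=(0.0, duration))
    return nbins * np.sum(ca * cb) / (np.sum(ca) * np.sum(cb))

rng = np.random.default_rng(3)
ta = rng.uniform(0.0, 1.0, 100000)   # station A arrival times (s)
tb = rng.uniform(0.0, 1.0, 100000)   # station B, independent of A
g2_uncorr = g2_zero_lag(ta, tb, 1e-4, 1.0)   # near 1: no correlation
g2_corr = g2_zero_lag(ta, ta, 1e-4, 1.0)     # above 1: bunched streams
```

This is why the paper stresses extreme absolute time keeping: the correlation signal lives entirely in whether photons from separate nodes land in the same narrow time bin.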

9.
We present a Bayesian approach to modelling galaxy clusters using multi-frequency pointed observations from telescopes that exploit the Sunyaev–Zel'dovich effect. We use the recently developed multinest technique to explore the high-dimensional parameter spaces and also to calculate the Bayesian evidence. This permits robust parameter estimation as well as model comparison. Tests on simulated Arcminute Microkelvin Imager observations of a cluster, in the presence of primary CMB signal, radio point sources (detected as well as an unresolved background) and receiver noise, show that our algorithm is able to analyse jointly the data from six frequency channels, sample the posterior space of the model and calculate the Bayesian evidence very efficiently on a single processor. We also illustrate the robustness of our detection process by applying it to a field with radio sources and primordial CMB but no cluster, and show that indeed no cluster is identified. The extension of our methodology to the detection and modelling of multiple clusters in multi-frequency SZ survey data will be described in a future work.
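The model-comparison step works because the evidence is the likelihood integrated over the prior, so a model family that wastes prior volume is penalized. In one dimension the integral can be done on a grid, which makes the idea concrete (MultiNest exists precisely because this brute-force integral is infeasible in many dimensions); the toy data and priors below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(2.0, 1.0, 20)     # toy data drawn around mu = 2

def log_like(mu, y, sigma=1.0):
    """Gaussian log-likelihood of the data for mean mu."""
    return np.sum(-0.5 * ((y - mu) / sigma) ** 2
                  - np.log(sigma * np.sqrt(2.0 * np.pi)))

# Model 1: mu fixed at 0 -> its evidence is just the likelihood there
log_z1 = log_like(0.0, y)

# Model 2: mu free, uniform prior on [-5, 5] -> evidence is the
# prior-weighted integral of the likelihood, done here on a grid
mu_grid = np.linspace(-5.0, 5.0, 2001)
ll = np.array([log_like(m, y) for m in mu_grid])
prior = np.full(mu_grid.size, 1.0 / 10.0)
dmu = mu_grid[1] - mu_grid[0]
log_z2 = ll.max() + np.log(np.sum(np.exp(ll - ll.max()) * prior) * dmu)

log_bayes_factor = log_z2 - log_z1   # strongly favours the free-mu model
```

The same evidence comparison is what lets the authors decide "cluster" versus "no cluster" from the data alone.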

10.
The forecasting technique of target tracking based on short arcs at a single station is an important way to guarantee that high-precision photoelectric theodolites can normally track and capture targets in unconventional environments. We construct a tracking prediction algorithm based on a nonlinear filter, which can provide the guiding data for closed-loop tracking under normal circumstances. We also construct a target prediction algorithm based on a nonlinear transformation for the case without valid observational data, which can provide track guidance for the theodolite and ensure that targets are not lost. It is demonstrated that the nonlinear filter is more effective than the EKF (extended Kalman filter) in the tracking prediction algorithm for short arcs at a single station. The results indicate that the nonlinear filter designed in this paper can be used as the guiding algorithm for optical tracking equipment, with a guiding accuracy of the same order of magnitude as the theodolite's random measurement accuracy. When the systematic error of the equipment reaches 50″, the accuracy can reach 20″ for predictions over 60 s, which still satisfies the field-of-view requirement of the tracking equipment.
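The "predict when measurements stop" idea can be illustrated with the simplest possible stand-in: fit the short observed arc with a low-order polynomial and extrapolate it as guiding data. This is a toy sketch of the problem setting, not the paper's nonlinear filter.

```python
import numpy as np

def extrapolate_track(t_obs, angle_obs, t_future, degree=2):
    """Fit the short observed arc with a low-order polynomial and
    extrapolate it to t_future as guiding data.  A toy stand-in for
    the paper's nonlinear prediction algorithm."""
    coeffs = np.polyfit(t_obs, angle_obs, degree)
    return np.polyval(coeffs, t_future)

# synthetic azimuth track (degrees) with slow curvature
t = np.linspace(0.0, 10.0, 50)
az = 30.0 + 1.2 * t - 0.01 * t**2
pred = extrapolate_track(t, az, 12.0)               # 2 s past the arc
true_az = 30.0 + 1.2 * 12.0 - 0.01 * 12.0**2
```

A real guiding algorithm must of course cope with measurement noise and genuinely nonlinear dynamics, which is where the filter comparison in the abstract comes in.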

11.
Active galaxies     
In this paper I will review, in an unavoidably incomplete and biased way, the main results obtained by XMM‐Newton on Active Galactic Nuclei. I will then highlight the major issues still open, to which XMM‐Newton can give important contributions, especially if the observing programs shift in the future towards longer exposures of single objects and observations of large samples. I will also argue in favour of a legacy program consisting of good S/N observations of a flux‐limited, sizeable sample of AGN. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

12.
Temporal sampling does more than add another axis to the vector of observables. Instead, under the recognition that how objects change (and move) in time speaks directly to the physics underlying astronomical phenomena, next‐generation wide‐field synoptic surveys are poised to revolutionize our understanding of just about anything that goes bump in the night (which is just about everything at some level). Still, even the most ambitious surveys will require targeted spectroscopic follow‐up to fill in the physical details of newly discovered transients. We are now building a new system intended to ingest and classify transient phenomena in near real‐time from high‐throughput imaging data streams. Described herein, the Transient Classification Project at Berkeley will be making use of classification techniques operating on “features” extracted from time series and contextual (static) information. We also highlight the need for community adoption of a standard representation of astronomical time series data (i.e., “VOTimeseries”). (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
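"Features" here means scalar summaries computed from each light curve that a classifier can operate on. A few generic examples (illustrative only; the Berkeley project's actual feature set is not specified here):

```python
import numpy as np

def light_curve_features(t, mag):
    """A handful of simple time-series features of the kind a
    classification broker might extract from a light curve.
    Illustrative choices, not the project's actual feature set."""
    dm = mag - mag.mean()
    return {
        "amplitude": float(mag.max() - mag.min()),
        "std": float(mag.std()),
        "skew": float((dm ** 3).mean() / mag.std() ** 3),
        "median_cadence": float(np.median(np.diff(t))),
    }

# a quiescent star with one bright flare (brighter = smaller magnitude)
t = np.linspace(0, 30, 200)
flare = 18.0 - 2.0 * np.exp(-((t - 10) / 1.5) ** 2)
feats = light_curve_features(t, flare)
```

Feeding such feature vectors, together with contextual information (e.g. distance to the nearest galaxy), into a standard machine-learning classifier is the architecture the abstract describes.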

13.
The next generation of solar telescopes will enable us to resolve the fundamental scales of the solar atmosphere, i.e., the pressure scale height and the photon mean free path. High‐resolution observations of small‐scale structures with sizes down to 50 km require complex post‐focus instruments, which employ adaptive optics (AO) and benefit from advanced image restoration techniques. The GREGOR Fabry‐Pérot Interferometer (GFPI) will serve as an example of such an instrument to illustrate the challenges that are to be expected in instrumentation and data analysis with the next generation of solar telescopes. (© 2010 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

14.
We present the online MultiDark Database – a Virtual Observatory‐oriented, relational database for hosting various cosmological simulations. The data is accessible via an SQL (Structured Query Language) query interface, which also allows users to directly pose scientific questions, as shown in a number of examples in this paper. Further examples for the usage of the database are given in its extensive online documentation. The database is based on the same technology as the Millennium Database, a fact that will greatly facilitate the usage of both suites of cosmological simulations. The first release of the MultiDark Database hosts two 8.6 billion particle cosmological N‐body simulations: the Bolshoi (250 h–1 Mpc simulation box, 1 h–1 kpc resolution) and MultiDark Run1 simulation (MDR1, or BigBolshoi, 1000 h–1 Mpc simulation box, 7 h–1 kpc resolution). The extraction methods for halos/subhalos from the raw simulation data, and how this data is structured in the database, are explained in this paper. With the first data release, users get full access to halo/subhalo catalogs, various profiles of the halos at redshifts z = 0–15, and raw dark matter data for one time‐step of the Bolshoi and four time‐steps of the MultiDark simulation. Later releases will also include galaxy mock catalogs and additional merger trees for both simulations as well as new large volume simulations with high resolution. This project is further proof of the viability of storing and presenting complex data using relational database technology. We encourage other simulators to publish their results in a similar manner. (© 2013 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
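A scientific question against such a database is just an SQL statement over the halo catalogs. The helper below composes one; the table and column names (`MDR1.FOF`, `Mvir`, `snapnum`) and the `LIMIT` dialect are illustrative guesses for this sketch, not the documented MultiDark schema.

```python
def halo_query(snapnum, min_mvir, limit=10):
    """Compose an SQL query of the kind the database interface accepts:
    the most massive halos at one snapshot.  Table/column names and the
    LIMIT dialect are illustrative, not the actual MultiDark schema."""
    return (
        "SELECT haloId, x, y, z, Mvir "
        "FROM MDR1.FOF "
        f"WHERE snapnum = {snapnum} AND Mvir > {min_mvir:g} "
        "ORDER BY Mvir DESC "
        f"LIMIT {limit}"
    )

# ten most massive cluster-scale halos above 1e14 at a chosen snapshot
q = halo_query(85, 1e14)
```

Pushing the selection into the database like this is the whole point of the relational approach: only the rows of interest, not terabytes of raw particles, cross the network.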

15.
I review the current architecture of the HTN and make three suggestions for the future. (i) We should retain the expertise split between agents which deal with the science programmes and those which deal with telescope constraints. This makes it easy to add new programmes or new telescopes. (ii) We should develop “look ahead” schedulers which attempt to schedule a whole night at once. This will give reliable calculations for the chance an observation will be carried out, and give a better chance that high priority time critical observations are successfully scheduled. (iii) We should strive to attract more science programmes to the HTN, in particular time critical observations spread over many nights, and non‐time critical work which can benefit from access to databases and the literature. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
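Suggestion (ii) can be sketched as a toy whole-night scheduler: place time-critical requests (those pinned to a slot) by priority first, then fill the remaining slots with flexible work. Real schedulers also model slews, visibility windows, and weather; the point of the sketch is only the whole-night view.

```python
def schedule_night(requests, n_slots):
    """Toy look-ahead scheduler.  Time-critical requests carry a fixed
    'slot'; flexible requests have slot=None and fill what remains.
    Scheduling the whole night at once is what makes completion
    probabilities calculable in advance."""
    timed = sorted((r for r in requests if r.get("slot") is not None),
                   key=lambda r: -r["priority"])
    flexible = sorted((r for r in requests if r.get("slot") is None),
                      key=lambda r: -r["priority"])
    plan = {}
    for r in timed:                      # highest priority wins a clash
        if r["slot"] not in plan and 0 <= r["slot"] < n_slots:
            plan[r["slot"]] = r["name"]
    free = [s for s in range(n_slots) if s not in plan]
    for r, s in zip(flexible, free):     # fill leftover slots
        plan[s] = r["name"]
    return plan

requests = [
    {"name": "GRB-followup", "priority": 5, "slot": 2},   # time critical
    {"name": "flare-star",   "priority": 3, "slot": 2},   # loses the clash
    {"name": "survey-field", "priority": 4, "slot": None},
    {"name": "standard",     "priority": 1, "slot": None},
]
plan = schedule_night(requests, 3)
```

Because the whole night is planned up front, a programme agent can be told before sunset how likely its observation is to run, which is exactly the reliability gain the suggestion argues for.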

16.
A maximum entropy method (MEM) is presented for separating the emission resulting from different foreground components from simulated satellite observations of the cosmic microwave background radiation (CMBR). In particular, the method is applied to simulated observations by the proposed Planck Surveyor satellite. The simulations, performed by Bouchet & Gispert, include emission from the CMBR and the kinetic and thermal Sunyaev–Zel'dovich (SZ) effects from galaxy clusters, as well as Galactic dust, free–free and synchrotron emission. We find that the MEM technique performs well and produces faithful reconstructions of the main input components. The method is also compared with traditional Wiener filtering and is shown to produce consistently better results, particularly in the recovery of the thermal SZ effect.
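Wiener filtering, the traditional baseline the MEM is compared against, weights each Fourier mode by S/(S+N). A 1-D toy with NumPy makes the mechanism visible; here the signal and noise power spectra are taken as known ("oracle") from the simulated components themselves, which a real analysis must instead estimate.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
freq = np.fft.rfftfreq(n)
# smooth 'sky' component: power concentrated at low spatial frequencies
amp = 10.0 / (1.0 + (freq / 0.01) ** 2)
sig = np.fft.irfft(amp * (rng.normal(size=freq.size)
                          + 1j * rng.normal(size=freq.size)), n)
noise = 0.01 * rng.normal(size=n)     # flat (white) instrument noise
data = sig + noise

# Wiener filter: per-mode weight S/(S+N), with oracle power spectra
S = np.abs(np.fft.rfft(sig)) ** 2
N = np.abs(np.fft.rfft(noise)) ** 2
H = S / (S + N)
rec = np.fft.irfft(H * np.fft.rfft(data), n)
```

The filter keeps signal-dominated low-frequency modes and suppresses noise-dominated ones. Its weakness, which motivates the MEM, is that this linear, Gaussian-prior weighting handles strongly non-Gaussian components such as the SZ signal from discrete clusters poorly.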

17.
This document discusses the possibility of using compressed sensing techniques for measuring 2D spectro‐polarimetric information using only one etalon and a broad prefilter. Instead of using an etalon and an extremely narrow prefilter (with all the subsequent problems of alignment), the idea is to use multiplexing techniques to include in the observations all the secondary peaks of the etalon. The reconstruction of the signal is done under the assumption that it can be efficiently reproduced in an orthogonal basis set. (© 2010 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
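The recovery step in compressed sensing can be demonstrated on a toy problem: fewer multiplexed measurements than unknowns, solved by iterative soft-thresholding (ISTA) under a sparsity assumption. This is a generic illustration of the reconstruction principle, not the paper's spectro-polarimetric setup; a Gaussian random matrix stands in for the etalon's multiplexed transmission profile.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 40, 4                 # unknowns, measurements, sparsity
x_true = np.zeros(n)                 # signal sparse in its basis
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))  # multiplexing matrix
y = A @ x_true                       # m < n noiseless measurements

# ISTA: gradient step on ||y - Ax||^2, then soft-threshold (L1 prior)
x = np.zeros(n)
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
lam = 0.01
for _ in range(2000):
    x = x + A.T @ (y - A @ x) / L
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
```

Although the system is underdetermined (40 equations, 100 unknowns), the sparsity prior picks out the correct solution, which is exactly the leverage that lets one etalon with a broad prefilter replace a narrow, alignment-critical one.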

18.
Recent ISO data have allowed, for the first time, observationally based estimates for source confusion in mid-infrared surveys. We use the extragalactic source counts from ISOCAM in conjunction with K-band counts to predict the confusion resulting from galaxies in deep mid-infrared observations. We specifically concentrate on the near-future Space Infrared Telescope Facility (SIRTF) mission, and calculate the expected confusion for the Infrared Array Camera (IRAC) on board SIRTF. A defining scientific goal of the IRAC instrument will be the study of high-redshift galaxies using a deep, confusion-limited wide-field survey at 3–10 μm. A deep survey can reach 3-μJy sources with reasonable confidence in the shorter-wavelength IRAC bands. Truly confusion-limited images at 8 μm will be difficult to obtain because of practical time constraints, unless infrared galaxies exhibit very strong evolution beyond the deepest current observations. We find L* galaxies to be detectable to z = 3–3.5 at 8 μm, which is slightly more pessimistic than found in 1999 by Simpson & Eisenhardt.

19.
We describe the largest data‐producing astronomy project in the coming decade – the LSST (Large Synoptic Survey Telescope). The enormous data output, database contents, knowledge discovery, and community science expected from this project will impose massive data challenges on the astronomical research community. One of these challenge areas is the rapid machine learning, data mining, and classification of all novel astronomical events from each 3‐gigapixel (6‐GB) image obtained every 20 seconds throughout every night for the project duration of 10 years. We describe these challenges and a particular implementation of a classification broker for this data fire hose. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
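The scale of the "data fire hose" follows directly from the figures quoted in the abstract. A back-of-envelope calculation (the 8-hour night length is an assumption added here for illustration):

```python
# Rates implied by the abstract's numbers: a 6-GB image every 20 s
image_gb = 6          # one 3-gigapixel image
cadence_s = 20        # one image every 20 seconds
night_hours = 8       # assumed observing night length (not in abstract)

rate_gb_per_s = image_gb / cadence_s                  # sustained rate
images_per_night = night_hours * 3600 // cadence_s    # exposures/night
nightly_tb = images_per_night * image_gb / 1024       # raw TB per night
decade_pb = nightly_tb * 365 * 10 / 1024              # naive 10-yr total
```

Under these assumptions the raw stream is 0.3 GB/s, roughly 8 TB per clear night, and of order tens of petabytes over the decade, before any processing products are counted, which is why event classification must happen in the stream rather than after archiving.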

20.
The X‐shooter data reduction pipeline is an integral part of the X‐shooter project; it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X‐shooter consortium, with contributions from France, The Netherlands, and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for a quick‐look control, and in Garching, for a more thorough evaluation. The second is to allow an optimized data reduction for a scientific user. In the following I will first outline the main steps of data reduction with the pipeline, and then briefly show two examples of optimization of the results for science reduction. (© 2011 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

