Similar Documents
20 similar documents found (search time: 421 ms).
1.
A recently discovered solution of the three-body problem represents a new possible byproduct of binary–binary scattering. This is demonstrated by an explicit example. A way of detecting the new orbit automatically is discussed, and we give estimates related to the probability of its formation.

2.
3.
The general form is given of the surface density of an infinitely thin disc that generates a Stäckel potential in the disc only, using formulae for the potentials of elliptic and hyperbolic strings. This is useful for problems in which a simple form for the potential is important, while the corresponding surface density need only be known to check (numerically) that it is positive. A simple potential with a positive surface density is given. Formulae are also given for calculating the surface density of such a Stäckel disc in the case in which the rotation curve is given and all the mass is concentrated in the disc.
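As a concrete illustration of a disc potential with an everywhere positive surface density (our example, not necessarily the disc constructed in the paper), the classical Kuzmin disc is of Stäckel form:

```latex
% Kuzmin disc: potential and its (everywhere positive) surface density.
% G is the gravitational constant, M the disc mass, a the scale length.
\[
  \Phi(R, z) = -\frac{GM}{\sqrt{R^{2} + (a + |z|)^{2}}},
  \qquad
  \Sigma(R) = \frac{aM}{2\pi \left(R^{2} + a^{2}\right)^{3/2}} > 0 .
\]
```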

4.
It is assumed that O−C ('observed minus calculated') values of periodic variable stars are determined by three processes, namely measurement errors, random cycle-to-cycle jitter in the period, and possibly long-term changes in the mean period. By modelling the latter as a random walk, the covariances of all O−C values can be calculated. The covariances can then be used to estimate unknown model parameters and to choose between alternative models. Pseudo-residuals that could be used in assessing model fit are also defined. The theory is illustrated by four applications to spotted stars in eclipsing binaries.
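A minimal simulation sketch of this model class (our notation and parameter values, not the paper's): measurement error, white cycle-to-cycle period jitter, and a random walk in the mean period, with the O−C covariance matrix estimated empirically over many realizations.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_oc(n_cycles=200, p_mean=1.0, sig_meas=1e-3,
                sig_jitter=5e-4, sig_walk=1e-5, n_real=2000):
    """Simulate O-C curves under the three processes named in the abstract:
    measurement error, cycle-to-cycle period jitter, and a random walk in
    the mean period (all parameter values are purely illustrative)."""
    oc = np.empty((n_real, n_cycles))
    for i in range(n_real):
        walk = np.cumsum(rng.normal(0.0, sig_walk, n_cycles))   # slow period drift
        periods = p_mean + walk + rng.normal(0.0, sig_jitter, n_cycles)
        times = np.cumsum(periods)                              # observed epochs
        calc = p_mean * np.arange(1, n_cycles + 1)              # ephemeris prediction
        oc[i] = times - calc + rng.normal(0.0, sig_meas, n_cycles)
    return oc

oc = simulate_oc()
cov = np.cov(oc, rowvar=False)   # empirical covariance of the O-C values
print(cov[:3, :3])
```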

5.
A probabilistic technique for the joint estimation of background and sources with the aim of detecting faint and extended celestial objects is described. Bayesian probability theory is applied to gain insight into the co-existence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. A multiresolution analysis is used for revealing faint and extended objects in the frame of the Bayesian mixture model. All the revealed sources are parametrized automatically providing source position, net counts, morphological parameters and their errors.
We demonstrate the capability of our method by applying it to three simulated data sets characterized by different background and source intensities. Results obtained with two different priors on the source signal distribution are shown. The probabilistic method allows for the detection of bright and faint sources independently of their morphology and of the kind of background. The results from our analysis of the three simulated data sets are compared with other source detection methods. Additionally, the technique is applied to ROSAT All-Sky Survey data.
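The per-pixel core of such a two-component mixture can be sketched as follows (a deliberately stripped-down Poisson version with fixed rates and mixture weight, not the paper's full multiresolution model):

```python
import numpy as np
from scipy.stats import poisson

def source_probability(counts, bkg_rate, src_rate, prior_src=0.01):
    """Posterior probability that a pixel contains source signal under a
    two-component Poisson mixture: background-only versus background plus
    source. prior_src is the assumed mixture weight (illustrative)."""
    like_bkg = poisson.pmf(counts, bkg_rate)
    like_src = poisson.pmf(counts, bkg_rate + src_rate)
    num = prior_src * like_src
    return num / (num + (1.0 - prior_src) * like_bkg)

counts = np.array([2, 3, 9, 15, 4])          # photon counts in five pixels
print(source_probability(counts, bkg_rate=3.0, src_rate=8.0))
```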

6.
A time series is a sequence of well-defined data points obtained through repeated measurements over a certain time range. The analysis of such data has become increasingly important not only in natural science but also in many other fields of research. Peranso offers a complete set of powerful light-curve and period-analysis functions for working with large astronomical data sets. Substantial attention has been given to ease of use and data accuracy, making it one of the most productive time-series analysis packages available. In this paper we give an introduction to Peranso and its functionality. (© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
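Peranso itself is a GUI package, so its own interface is not shown here; as a stand-in, the sketch below runs the same kind of period search on a synthetic, irregularly sampled light curve using astropy's Lomb-Scargle implementation.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic irregularly sampled light curve with a 7.5-day periodic signal.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, 300))             # irregular epochs [d]
y = 0.3 * np.sin(2 * np.pi * t / 7.5) + rng.normal(0, 0.05, t.size)

frequency, power = LombScargle(t, y).autopower()
best_period = 1.0 / frequency[np.argmax(power)]
print(f"best period: {best_period:.3f} d")            # expect ~7.5 d
```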

7.
Protocols for dealing with time-sensitive observations have traditionally focused on robotic telescope networks and other types of automated dedicated facilities, mostly in the optical domain. Using UKIRT and JCMT as examples, which are infrared and sub-millimetre telescopes with a traditional PI-dominated user base, we discuss how such facilities can join a heterogeneous telescope network to their mutual advantage. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

8.
9.
We briefly describe the Palomar-Quest (PQ) digital synoptic sky survey, including its parameters, data processing, status, and plans. Exploration of the time domain is now the central scientific and technological focus of the survey. To this end, we have developed a real-time pipeline for detection of transient sources. We describe some of the early results and lessons learned that may be useful for other, similar projects and for time-domain astronomy in general. Finally, we discuss some issues and challenges posed by the real-time analysis and scientific exploitation of massive data streams from modern synoptic sky surveys. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

10.
Temporal sampling does more than add another axis to the vector of observables. Instead, under the recognition that how objects change (and move) in time speaks directly to the physics underlying astronomical phenomena, next-generation wide-field synoptic surveys are poised to revolutionize our understanding of just about anything that goes bump in the night (which is just about everything at some level). Still, even the most ambitious surveys will require targeted spectroscopic follow-up to fill in the physical details of newly discovered transients. We are now building a new system intended to ingest and classify transient phenomena in near real time from high-throughput imaging data streams. Described herein, the Transient Classification Project at Berkeley will be making use of classification techniques operating on "features" extracted from time series and contextual (static) information. We also highlight the need for community adoption of a standard representation of astronomical time-series data (i.e. "VOTimeseries"). (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
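To make the "features extracted from time series" idea concrete, here is a small sketch computing a few simple light-curve statistics of the kind such classifiers consume; the feature choices are ours for illustration, not the Berkeley project's actual feature set.

```python
import numpy as np

def lightcurve_features(mag):
    """A few simple time-series 'features' for a transient classifier
    (illustrative choices only)."""
    std = np.std(mag)
    features = {
        "amplitude": np.ptp(mag),                          # peak-to-peak range
        "std": std,
        "skew": np.mean(((mag - mag.mean()) / std) ** 3),  # asymmetry
        # fraction of points more than one sigma from the median
        "beyond1std": np.mean(np.abs(mag - np.median(mag)) > std),
    }
    return features

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 50, 120))
mag = 15.0 + 0.4 * np.sin(2 * np.pi * t / 3.3) + rng.normal(0, 0.05, t.size)
print(lightcurve_features(mag))
```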

11.
We introduce a modified version of a standard power-spectrum 'peak-bagging' technique, designed to gain some of the advantages that fitting the entire low-degree p-mode power spectrum simultaneously would bring, but without the problems involved in fitting a model incorporating many hundreds of parameters. Using Monte Carlo simulations, we show that this modified fitting code can determine the true background level in the vicinity of the p-mode peaks. We also show that small biases in other mode parameters, which stem from inaccurate estimates of the true background, are consequently removed. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
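A minimal sketch of the kind of fit peak-bagging performs (a single Lorentzian mode plus flat background, maximum likelihood under the chi-squared two-degrees-of-freedom statistics appropriate to a power spectrum; synthetic data, not the authors' code):

```python
import numpy as np
from scipy.optimize import minimize

def lorentzian(nu, nu0, width, height, bkg):
    """Single p-mode profile plus flat background (all parameters positive)."""
    return height / (1.0 + ((nu - nu0) / (width / 2.0)) ** 2) + bkg

def neg_log_like(theta, nu, power):
    """Negative log-likelihood for an exponentially distributed (chi^2, 2 d.o.f.)
    power spectrum -- the standard peak-bagging statistic."""
    model = lorentzian(nu, *theta)
    return np.sum(np.log(model) + power / model)

rng = np.random.default_rng(3)
nu = np.linspace(2990.0, 3010.0, 400)              # frequency [uHz]
truth = lorentzian(nu, 3000.0, 1.0, 50.0, 2.0)
power = truth * rng.exponential(1.0, nu.size)      # chi^2_2 realization noise

res = minimize(neg_log_like, x0=[3000.5, 1.5, 40.0, 1.5],
               args=(nu, power), method="Nelder-Mead")
print(res.x)   # fitted [nu0, width, height, background level]
```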

12.
13.
The theory of low-order linear stochastic differential equations is reviewed. Solutions to these equations give the continuous-time analogues of discrete-time autoregressive time series. Explicit expressions for the power spectra and covariance functions of the first- and second-order forms are given. A conceptually simple method is described for fitting continuous-time autoregressive models to data. Formulae giving the standard errors of the parameter estimates are derived. Simulated data are used to verify the performance of the methods. Irregularly spaced observations of the two hydrogen-deficient stars FQ Aqr and NO Ser are analysed. In the case of FQ Aqr the best-fitting model is of second order, and describes a quasi-periodicity of about 20 d with an e-folding time of 3.7 d. The NO Ser data are best fitted by a first-order model with an e-folding time of 7.2 d.
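For reference, the first-order case (the Ornstein–Uhlenbeck process) has well-known closed forms; the notation below is ours, with e-folding time τ:

```latex
% First-order continuous-time autoregressive (CAR(1) / Ornstein-Uhlenbeck) model:
\[
  dX(t) = -\frac{1}{\tau}\, X(t)\, dt + \sigma\, dW(t),
\]
% with stationary autocovariance and (two-sided) power spectrum
\[
  R(\Delta) = \frac{\sigma^{2}\tau}{2}\, e^{-|\Delta|/\tau},
  \qquad
  S(f) = \frac{\sigma^{2}}{(1/\tau)^{2} + (2\pi f)^{2}} .
\]
```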

14.
15.
We investigate the application of neural networks to the automation of MK spectral classification. The data set for this project consists of a set of over 5000 optical (3800–5200 Å) spectra obtained from objective prism plates from the Michigan Spectral Survey. These spectra, along with their two-dimensional MK classifications listed in the Michigan Henry Draper Catalogue, were used to develop supervised neural network classifiers. We show that neural networks can give accurate spectral type classifications (σ_68 = 0.82 subtypes, σ_rms = 1.09 subtypes) across the full range of spectral types present in the data set (B2–M7). We show also that the networks yield correct luminosity classes for over 95 per cent of both dwarfs and giants with a high degree of confidence.

Stellar spectra generally contain a large amount of redundant information. We investigate the application of principal components analysis (PCA) to the optimal compression of spectra. We show that PCA can compress the spectra by a factor of over 30 while retaining essentially all of the useful information in the data set. Furthermore, it is shown that this compression optimally removes noise and can be used to identify unusual spectra.

This paper is a continuation of the work carried out by von Hippel et al. (Paper I).
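A factor-of-30 PCA compression of spectra can be sketched as follows; the data here are synthetic Gaussian pseudo-spectra and the dimensions are illustrative (the real inputs are the ~5000 Michigan survey spectra):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_spectra, n_pix = 500, 600            # flux samples per spectrum (illustrative)
base = np.exp(-0.5 * ((np.arange(n_pix) - 300) / 40.0) ** 2)
spectra = (rng.uniform(0.5, 2.0, (n_spectra, 1)) * base      # varying strength
           + rng.normal(0.0, 0.02, (n_spectra, n_pix)))      # pixel noise

pca = PCA(n_components=20)             # 600 -> 20: a factor-of-30 compression
scores = pca.fit_transform(spectra)    # compressed representation
restored = pca.inverse_transform(scores)   # reconstruction (noise suppressed)
print(scores.shape, np.mean((restored - spectra) ** 2))
```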

16.
In this article we describe a case study of how NOAO is considering improving its management of Target-of-Opportunity (ToO) observations by integrating VOEvent into the flow of activities. We believe that using VOEvent to help document and track the use of ToO time will improve the user experience of ToOs at NOAO. It will also greatly aid in the management of the process and of the resulting data, allowing us to better track the ownership and provenance of the data and any resulting data products. Finally, it will provide an important method of archival access to the data and data "collections," which might include not only processed data from a single VOEvent triggered observation but could also include multiple observations traceable to a single (or set of related) VOEvents. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
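For flavour, here is a schematic sketch of building a minimal VOEvent-style packet with Python's standard library. The element layout loosely follows VOEvent 2.0 (Who/What/WhereWhen); the IVORN and parameter values are invented placeholders, not NOAO's actual scheme.

```python
import xml.etree.ElementTree as ET

# Minimal, schematic VOEvent-style packet (placeholder identifiers throughout).
root = ET.Element("VOEvent", version="2.0", role="observation",
                  ivorn="ivo://example.org/too#0001")
who = ET.SubElement(root, "Who")
ET.SubElement(who, "AuthorIVORN").text = "ivo://example.org"
what = ET.SubElement(root, "What")
ET.SubElement(what, "Param", name="exposure_s", value="300")
ET.SubElement(root, "WhereWhen")   # would carry the sky position and time
print(ET.tostring(root, encoding="unicode"))
```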

17.
18.
19.
This contribution aims to introduce the idea that a well-evolved HTN of the far future, with the anticipated addition of very large apertures, could also incorporate the ability to carry out photonic astronomy observations, particularly optical VLBI in a revived Hanbury Brown and Twiss intensity interferometry (HBTII) configuration. Such an HTN could exploit its inherent capacity for rapid reconfiguration to become a multi-aperture distributed photon-counting network able to study higher-order spatiotemporal photon correlations and provide a unique tool for direct diagnostics of astrophysical emission processes. We very briefly review various considerations associated with switching the HTN to a special mode in which single-photon detection events are continuously captured for a posteriori intercorrelation. In this context, photon arrival times should be determined to the highest time resolution possible, and extremely demanding absolute time-keeping and absolute time-distribution schemes should be devised and implemented in the HTN nodes involved. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
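The a-posteriori intercorrelation step amounts to histogramming pairwise arrival-time differences between photon streams, the raw ingredient of an intensity-interferometry g²(τ) estimate. A schematic sketch (normalization, dead time and instrumental effects are ignored; data are synthetic):

```python
import numpy as np

def pair_time_differences(t1, t2, max_lag, bin_width):
    """Histogram of arrival-time differences t2 - t1 within +/- max_lag,
    for two sorted photon time streams (schematic two-pointer sweep)."""
    diffs = []
    j0 = 0
    for t in t1:
        while j0 < t2.size and t2[j0] < t - max_lag:   # drop events too early
            j0 += 1
        j = j0
        while j < t2.size and t2[j] <= t + max_lag:    # collect events in window
            diffs.append(t2[j] - t)
            j += 1
    bins = np.arange(-max_lag, max_lag + bin_width, bin_width)
    return np.histogram(diffs, bins=bins)

rng = np.random.default_rng(5)
t1 = np.sort(rng.uniform(0.0, 1.0, 2000))   # photon arrival times [s]
t2 = np.sort(rng.uniform(0.0, 1.0, 2000))
hist, edges = pair_time_differences(t1, t2, max_lag=1e-3, bin_width=1e-4)
print(hist)
```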

20.
Sodium laser guide stars (LGSs) are elongated sources due to the thickness and the finite distance of the sodium layer. The fluctuations of the sodium layer altitude and atom density profile induce errors in centroid measurements of elongated spots, and generate spurious optical aberrations in closed-loop adaptive optics (AO) systems. According to an analytical model and experimental results obtained with the University of Victoria LGS bench demonstrator, one of the main origins of these aberrations, referred to as LGS aberrations, is not the centre-of-gravity (CoG) algorithm itself, but the thresholding applied to the pixels of the image prior to computing the spot centroids. A new thresholding method, termed 'radial thresholding', is presented here, cancelling out most of the LGS aberrations without altering the centroid measurement accuracy.
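The conventional step the paper identifies as the culprit can be sketched as follows: a centre-of-gravity centroid computed after subtracting and clipping a uniform threshold (the proposed 'radial thresholding' instead varies the threshold with spot geometry, which is not reproduced here):

```python
import numpy as np

def centroid_cog(img, threshold):
    """Centre-of-gravity of a wavefront-sensor spot after subtracting and
    clipping a uniform threshold (the standard thresholded CoG)."""
    w = np.clip(img - threshold, 0.0, None)   # thresholded pixel weights
    y, x = np.indices(img.shape)
    total = w.sum()
    return (x * w).sum() / total, (y * w).sum() / total

# Elongated synthetic spot mimicking an LGS image (illustrative).
y, x = np.indices((32, 32))
spot = np.exp(-(((x - 16) / 2.0) ** 2 + ((y - 16) / 6.0) ** 2) / 2.0)
spot += np.random.default_rng(9).normal(0.0, 0.01, spot.shape)
print(centroid_cog(spot, threshold=0.05))
```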
