Similar documents
20 similar documents found (search time: 171 ms)
1.
We present a Bayesian approach to modelling galaxy clusters using multi-frequency pointed observations from telescopes that exploit the Sunyaev–Zel'dovich effect. We use the recently developed MultiNest technique to explore the high-dimensional parameter spaces and also to calculate the Bayesian evidence. This permits robust parameter estimation as well as model comparison. Tests on simulated Arcminute Microkelvin Imager observations of a cluster, in the presence of primary CMB signal, radio point sources (detected as well as an unresolved background) and receiver noise, show that our algorithm is able to analyse jointly the data from six frequency channels, sample the posterior space of the model and calculate the Bayesian evidence very efficiently on a single processor. We also illustrate the robustness of our detection process by applying it to a field with radio sources and primordial CMB but no cluster, and show that indeed no cluster is identified. The extension of our methodology to the detection and modelling of multiple clusters in multi-frequency SZ survey data will be described in future work.

2.
A new fast Bayesian approach is introduced for the detection of discrete objects immersed in a diffuse background. This new method, called PowellSnakes, speeds up traditional Bayesian techniques by (i) replacing the standard form of the likelihood for the parameters characterizing the discrete objects by an alternative exact form that is much quicker to evaluate; (ii) using a simultaneous multiple-minimization code based on Powell's direction-set algorithm to locate rapidly the local maxima in the posterior; and (iii) deciding whether each located posterior peak corresponds to a real object by performing a Bayesian model selection using an approximate evidence value based on a local Gaussian approximation to the peak. The construction of this Gaussian approximation also provides the covariance matrix of the uncertainties in the derived parameter values for the object in question. This new approach provides a speed-up in performance by a factor of ∼100 as compared with existing Bayesian source-extraction methods that use Markov chain Monte Carlo (MCMC) to explore the parameter space, such as that presented by Hobson & McLachlan. The method can be implemented in either real or Fourier space. In the case of objects embedded in a homogeneous random field, working in Fourier space provides a further speed-up that takes advantage of the fact that the correlation matrix of the background is circulant. We illustrate the capabilities of the method by applying it to some simplified toy models. Furthermore, PowellSnakes has the advantage of consistently defining the acceptance/rejection threshold from the priors, which cannot be said of frequentist methods. We present here the first implementation of this technique (version I). Further improvements to this implementation are currently under investigation and will be published shortly. The application of the method to realistic simulated Planck observations will be presented in a forthcoming publication.
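The two-stage scheme above (a fast gradient-free Powell search for the posterior peak, followed by a local Gaussian approximation that yields both the parameter covariance and an approximate evidence) can be sketched in a few lines. This is a toy illustration, not the PowellSnakes likelihood: the one-source 1-D data model, the flat priors and the function `neg_log_post` are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: one Gaussian "source" on a 1-D grid plus white noise.
rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 201)
sigma_n = 0.5
data = 3.0 * np.exp(-0.5 * (x - 1.2) ** 2) + rng.normal(0.0, sigma_n, x.size)

def neg_log_post(theta):
    """Negative log-posterior for (amplitude, position); flat priors assumed."""
    amp, pos = theta
    model = amp * np.exp(-0.5 * (x - pos) ** 2)
    return 0.5 * np.sum((data - model) ** 2) / sigma_n ** 2

# Stage 1: Powell's direction-set method locates the posterior peak
# rapidly and without needing gradients.
res = minimize(neg_log_post, x0=[1.0, 0.0], method="Powell")

# Stage 2: Laplace (local Gaussian) approximation at the peak. The
# numerical Hessian gives the parameter covariance and an approximate
# log-evidence: log Z ~ -NLP(peak) + (k/2) ln(2*pi) - 0.5 ln|H|.
def hessian(f, p, h=1e-4):
    k = len(p)
    H = np.empty((k, k))
    for i in range(k):
        for j in range(k):
            def shifted(si, sj):
                q = np.array(p, dtype=float)
                q[i] += si * h
                q[j] += sj * h
                return f(q)
            H[i, j] = (shifted(1, 1) - shifted(1, -1)
                       - shifted(-1, 1) + shifted(-1, -1)) / (4 * h ** 2)
    return H

H = hessian(neg_log_post, res.x)
cov = np.linalg.inv(H)                      # parameter uncertainties at the peak
log_evidence = (-res.fun + 0.5 * len(res.x) * np.log(2 * np.pi)
                - 0.5 * np.log(np.linalg.det(H)))
```

The covariance matrix `cov` plays the role of the per-object uncertainty estimate described above; an evidence ratio between the "source" and "no source" models would then drive the accept/reject decision.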

3.
We use the billion-particle Hubble Volume simulations to make statistical predictions for the distribution of galaxy clusters that will be observed by the Planck Surveyor satellite through their effect on the cosmic microwave background – the Sunyaev–Zel'dovich (SZ) effect. We utilize the lightcone data sets for both critical-density (τCDM) and flat low-density (ΛCDM) cosmologies: a 'full-sky' survey out to z ∼ 0.5, two 'octant' data sets out to beyond z = 1, and a 100-square-degree data set extending to z ∼ 4. Making simple, but robust, assumptions regarding both the thermodynamic state of the gas and the detection of objects against an unresolved background, we present the expected number of SZ sources as a function of redshift and angular size, and also as a function of flux (for both the thermal and kinetic effects) for three of the relevant High Frequency Instrument frequency channels. We confirm the expectation that the Planck Surveyor will detect around 5 × 10⁴ clusters, though the exact number is sensitive to the choice of several parameters including the baryon fraction, and also to the cluster density profile, so that either cosmology may predict more clusters. We also find that the majority of detected sources should be at z < 1.5, and we estimate that around 1 per cent of clusters will be spatially resolved by the Planck Surveyor, though this has a large uncertainty.

4.
We present a careful analysis of the point-source detection limit of the AKARI All-Sky Survey in the WIDE-S 90-μm band near the North Ecliptic Pole (NEP). Timeline analysis is used to detect IRAS (Infrared Astronomical Satellite) sources, and then a conversion factor is derived to transform the peak timeline signal to the interpolated 90-μm flux of a source. Combined with a robust noise measurement, the point-source flux detection limit at signal-to-noise ratio (S/N) > 5 for a single detector row is 1.1 ± 0.1 Jy, which corresponds to a point-source detection limit for the survey of ∼0.4 Jy.
The wavelet transform offers a multiscale representation of the time-series data (TSD). We calculate the continuous wavelet transform of the TSD and then search for significant wavelet coefficients, which are treated as potential source detections. To discriminate real sources from spurious or moving objects, only sources with confirmation are selected. In our multiscale analysis, IRAS sources selected above 4σ can be identified as the only real sources at the point-source scales. We also investigate the correlation between the non-IRAS sources detected in the timeline analysis and cirrus emission, using the wavelet transform and contour plots of the wavelet power spectrum. It is shown that the non-IRAS sources are most likely caused by excess noise over a large range of spatial scales, rather than by real extended structures such as cirrus clouds.

5.
We have surveyed 188 ROSAT Position Sensitive Proportional Counter (PSPC) fields for X-ray sources with hard spectra (α < 0.5); such sources must be major contributors to the X-ray background at faint fluxes. In this paper we present optical identifications for 62 of these sources: 28 active galactic nuclei (AGN) which show broad lines in their optical spectra (BLAGN), 13 narrow emission-line galaxies (NELGs), five galaxies with no visible emission lines, eight clusters and eight Galactic stars.
The BLAGN, NELGs and galaxies have similar distributions of X-ray flux and spectra. Their ROSAT spectra are consistent with their being AGN obscured by columns of 20.5 < log(N_H/cm⁻²) < 23. The hard-spectrum BLAGN have a distribution of X-ray-to-optical ratios similar to that found for AGN from soft X-ray surveys (1 < α_OX < 2). However, a relatively large proportion (15 per cent) of the BLAGN, NELGs and galaxies are radio loud. This could be because the radio jets in these objects produce intrinsically hard X-ray emission; or, if their hardness is caused by absorption, it could be because radio-loud objects are more X-ray luminous than radio-quiet objects. The eight hard sources identified as clusters of galaxies are the brightest and softest group of sources, and hence clusters are unlikely to be an important component of the hard, faint population.
We propose that BLAGN are likely to constitute a significant fraction of the faint, hard 0.5–2 keV population and could be important in reproducing the shape of the X-ray background, because they are the most numerous type of object in our sample (comprising almost half the identified sources), and because all our identified high-redshift (z > 1) hard sources have broad lines.

6.
A speedy pixon algorithm for image reconstruction is described. Two applications of the method to simulated astronomical data sets are also reported. In one case, galaxy clusters are extracted from multiwavelength microwave sky maps using the spectral dependence of the Sunyaev–Zel'dovich effect to distinguish them from the microwave background fluctuations and the instrumental noise. The second example involves the recovery of a sharply peaked emission profile, such as might be produced by a galaxy cluster observed in X-rays. These simulations show the ability of the technique both to detect sources in low signal-to-noise ratio data and to deconvolve a telescope beam in order to recover the internal structure of a source.

7.
Extracting sources with low signal-to-noise ratio (S/N) from maps with structured background is a non-trivial task which has become important in studying the faint end of the submillimetre (submm) number counts. In this paper, we study source extraction from submm jiggle-maps from the Submillimetre Common-User Bolometer Array (SCUBA) using the Mexican hat wavelet (MHW), an isotropic wavelet technique. As a case study, we use a large (11.8-arcmin²) jiggle-map of the galaxy cluster Abell 2218 (A2218), with an 850-μm 1σ rms sensitivity of 0.6–1 mJy. We show via simulations that the MHW is a powerful tool for the reliable extraction of low-S/N sources from SCUBA jiggle-maps, and nine sources are detected in the A2218 850-μm image. Three of these sources are identified as images of a single background source with an unlensed flux of 0.8 mJy. Further, two singly imaged sources also have unlensed fluxes <2 mJy, below the blank-field confusion limit. In this ultradeep map, the individually detected sources resolve nearly all of the extragalactic background light at 850 μm, and the deep data allow us to place an upper limit of 44 sources arcmin⁻² down to 0.2 mJy at 850 μm.
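The core of the MHW technique can be sketched briefly: convolve the map with a Mexican hat kernel matched to the beam scale (its zero total weight suppresses smooth structured background) and keep local maxima above a significance threshold. Everything numerical below is an illustrative assumption rather than a SCUBA parameter: the map size, the planted source amplitudes, the beam σ of 2 pixels and the 4σ cut.

```python
import numpy as np
from scipy.ndimage import maximum_filter
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
n, beam = 128, 2.0
yy, xx = np.mgrid[0:n, 0:n]

# Simulated map: unit white noise plus three beam-smeared point sources.
sky = rng.normal(0.0, 1.0, (n, n))
truth = [(32, 40, 6.0), (64, 90, 5.0), (100, 20, 7.0)]
for y0, x0, amp in truth:
    sky += amp * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * beam ** 2))

# Mexican hat wavelet at the beam scale (negative Laplacian of a Gaussian);
# subtracting the residual mean enforces zero total weight, so any smooth
# background is filtered out.
r2 = (xx - n // 2) ** 2 + (yy - n // 2) ** 2
mhw = (2.0 - r2 / beam ** 2) * np.exp(-r2 / (2 * beam ** 2))
mhw -= mhw.mean()

filtered = fftconvolve(sky, mhw, mode="same")
snr = filtered / filtered.std()

# Detections: local maxima of the filtered map above 4 sigma.
peaks = (snr == maximum_filter(snr, size=9)) & (snr > 4.0)
detections = np.argwhere(peaks)          # (y, x) pixel positions
```

In a real analysis the wavelet scale would be tuned to maximize the filtered S/N for the known beam, and the noise level estimated robustly rather than from the map's global standard deviation.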

8.
Wide-field radio interferometric images often contain a large population of faint compact sources. Due to their low intensity-to-noise ratio, these objects can easily be missed by automated detection methods, which have classically been based on thresholding techniques after local noise estimation. The aim of this paper is to present and analyse the performance of several alternative or complementary techniques to thresholding. We compare three different algorithms to increase the detection rate of faint objects. The first technique combines wavelet decomposition with local thresholding. The second technique is based on the structural behaviour of the neighbourhood of each pixel. Finally, the third algorithm uses local features extracted from a bank of filters and a boosting classifier to perform the detections. The methods' performances are evaluated using simulations and radio mosaics from the Giant Metrewave Radio Telescope and the Australia Telescope Compact Array. We show that the new methods perform better than well-known state-of-the-art methods such as SExtractor, SAD and DUCHAMP at detecting faint sources in radio interferometric images.

9.
10.
In order to improve the ability to find faint and small Solar system bodies, a method of shifting and stacking images, which improves the detection efficiency for faint moving objects, is applied to sequential optical images. The method determines the existence of moving objects by using the method of false position to pre-estimate their apparent velocities, and then iteratively determines their accurate positions based on the signal-to-noise ratio (S/N) and elongation of the stellar image. Using sequential images from the China Near Earth Object Survey Telescope (CNEOST), we carry out an experiment and succeed in detecting asteroids fainter than magnitude 21 that are invisible in a single image, thus verifying the feasibility of the method.
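The shift-and-stack principle described above is simple to demonstrate: de-shift each frame along a trial velocity and average, so the moving source adds coherently while the noise averages down by the square root of the number of frames. This toy sketch uses invented numbers (8 frames, a per-frame peak S/N of about 2, integer-pixel rolls) and a brute-force grid of trial velocities in place of the false-position pre-estimate used by the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_frames, n = 8, 64
yy, xx = np.mgrid[0:n, 0:n]

# Sequence of exposures: a moving "asteroid" (peak S/N ~ 2 per frame,
# invisible in any single image) drifting at (vy, vx) px/frame through noise.
vy_true, vx_true = 1.0, 2.0
frames = []
for t in range(n_frames):
    frame = rng.normal(0.0, 1.0, (n, n))
    y0, x0 = 10 + vy_true * t, 5 + vx_true * t
    frame += 2.0 * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 1.5 ** 2))
    frames.append(frame)

def shift_and_stack(frames, vy, vx):
    """Undo a trial motion of (vy, vx) px/frame, then average the frames."""
    out = np.zeros_like(frames[0])
    for t, f in enumerate(frames):
        out += np.roll(f, (-round(vy * t), -round(vx * t)), axis=(0, 1))
    return out / len(frames)

# Grid search over trial velocities: only near the correct velocity does
# the source stack coherently, so the stacked peak is maximized there.
best = max(((shift_and_stack(frames, ty, tx).max(), ty, tx)
            for ty in np.arange(0.0, 2.1, 0.5)
            for tx in np.arange(0.0, 3.1, 0.5)),
           key=lambda r: r[0])
peak, vy_best, vx_best = best
```

A production pipeline would use sub-pixel shifts and interpolation rather than `np.roll`, and would vet candidates by S/N and elongation as the abstract describes.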

11.
As the quality of the available galaxy cluster data improves, the models fitted to these data can be expected to become increasingly complex. Here we present the Bayesian approach to the problem of cluster data modelling: starting from simple, physically motivated parametrized functions to describe the cluster's gas density, gravitational potential and temperature, we explore the high-dimensional parameter spaces with a Markov chain Monte Carlo sampler, and compute the Bayesian evidence in order to make probabilistic statements about the models tested. In this way, sufficiently good data will enable the models to be distinguished, enhancing our astrophysical understanding; in any case, the models may be marginalized over in the correct way when estimating global, perhaps cosmological, parameters. In this work we apply this methodology to two sets of simulated interferometric Sunyaev–Zel'dovich effect and gravitational weak-lensing data, corresponding to current and next-generation telescopes. We calculate the expected precision of the measurement of the cluster gas fraction from such experiments, and investigate the effect of primordial cosmic microwave background (CMB) fluctuations on their accuracy. We find that data from instruments such as the Arcminute Microkelvin Imager (AMI), when combined with wide-field ground-based weak-lensing data, should allow both cluster model selection and estimation of gas fractions to a precision of better than 30 per cent for a given cluster.
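The MCMC exploration of a cluster parameter space can be illustrated with a minimal random-walk Metropolis sampler. The two-parameter "posterior" below (a gas fraction and a temperature with invented fiducial values and widths, and flat priors inside hard bounds) is a stand-in for illustration only, not the cluster model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_post(theta):
    """Toy posterior: gas fraction fgas and temperature T (keV), Gaussian
    around assumed fiducial values, with flat priors inside hard bounds."""
    fgas, T = theta
    if not (0.0 < fgas < 1.0 and 0.0 < T < 20.0):
        return -np.inf
    return -0.5 * (((fgas - 0.12) / 0.02) ** 2 + ((T - 8.0) / 1.0) ** 2)

# Random-walk Metropolis: propose a Gaussian step, accept with
# probability min(1, posterior ratio).
theta = np.array([0.5, 5.0])           # deliberately poor starting point
lp = log_post(theta)
chain = []
for _ in range(20000):
    proposal = theta + rng.normal(0.0, [0.01, 0.5])
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = proposal, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]         # discard burn-in

fgas_mean, fgas_err = chain[:, 0].mean(), chain[:, 0].std()
```

The posterior mean and standard deviation of the chain give the parameter estimate and its uncertainty; computing the Bayesian evidence for model comparison requires more machinery (e.g. nested sampling or thermodynamic integration) than this simple sampler provides.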

12.
The SCUBA instrument on the James Clerk Maxwell Telescope has already had an impact on cosmology by detecting relatively large numbers of dusty galaxies at high redshift. Apart from identifying well-detected sources, such data can also be mined for information about fainter sources and their correlations, as revealed through low-level fluctuations in SCUBA maps. As a first step in this direction, we analyse a small SCUBA data set as if it were obtained from a cosmic microwave background (CMB) differencing experiment. This enables us to place limits on CMB anisotropy at 850 μm. Expressed as Q_flat, the quadrupole expectation value for a flat power spectrum, the limit is 152 μK at 95 per cent confidence, corresponding to ΔT/T < 1.4 × 10⁻⁵ for a Gaussian autocorrelation function with a coherence angle of about 20–25 arcsec. These results could easily be reinterpreted in terms of any other fluctuating sky signal. This is currently the best limit on these scales at high frequency, and comparable to limits at similar angular scales in the radio. Even with such a modest data set, it is possible to put a constraint on the slope of the SCUBA counts at the faint end, since even randomly distributed sources would lead to fluctuations. Future analysis of sky correlations in more extensive data sets ought to yield detections, and hence additional information on source counts and clustering.

13.
We present an analysis of quasar variability from data collected during a photometric monitoring of 50 objects carried out at CNPq/Laboratório Nacional de Astrofísica, Brazil, between 1993 March and 1996 July. A distinctive feature of this survey is its photometric accuracy, ∼0.02 mag in V, achieved through differential photometry with CCD detectors, which allows the detection of faint levels of variability. We find that the relative variability, δ, observed in the V band is anticorrelated with both luminosity and redshift, although we have no means of determining which relation is dominant, given the strong coupling between luminosity and redshift for the objects in our sample. We introduce a model for the dependence of quasar variability on frequency that is consistent with multiwavelength observations of the nuclear variability of the Seyfert galaxy NGC 4151. We show that correcting the observed variability for this effect slightly increases the significance of the trends of variability with luminosity and redshift. Assuming that variability depends only on luminosity, we show that the corrected variability is anticorrelated with luminosity and is in good agreement with the predictions of a simple Poissonian model. The energy derived for the hypothetical pulses, ∼10⁵⁰ erg, agrees well with that obtained in other studies. We also find that the radio-loud objects in our sample tend to be more variable than the radio-quiet ones, at all luminosities and redshifts.

14.
Observations by present and future X-ray telescopes include a large number of serendipitous sources of unknown type. They are a rich source of knowledge about X-ray-dominated astronomical objects, their distribution, and their evolution. The large number of these sources does not permit individual spectroscopic follow-up and classification. Here we use Chandra Multi-Wavelength public data to investigate a number of statistical algorithms for the classification of X-ray sources with optical imaging follow-up. We show that, up to statistical uncertainties, each class of X-ray source has specific photometric characteristics that can be used for its classification. We assess the relative and absolute performance of the classification methods and measured features by comparing the behaviour of physical quantities for statistically classified objects with what is obtained from spectroscopy. We find that, among the methods we have studied, the multi-dimensional probability distribution is the best for classifying both source type and redshift, but it needs a sufficiently large input (learning) data set. In the absence of such data, a mixture of various methods can give a better final result. We discuss some of the potential applications of this statistical classification and the enhancement of the information obtained in this way. We also assess the effect of the classification methods and the input data set on astronomical conclusions such as the distribution and properties of X-ray-selected sources. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
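The "multi-dimensional probability distribution" idea, classify each source by the class whose learned feature distribution makes its photometry most likely, can be sketched in its simplest factorized-Gaussian (naive Bayes) form. The two classes, the two features and all the class centres and scatters below are invented for illustration; a real application would learn the distributions from a spectroscopically classified training set.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy training set: two source classes in two photometric features
# (say, a log X-ray/optical flux ratio and an optical colour).
train = {
    "AGN":  rng.normal([1.0, 0.3], [0.4, 0.2], (300, 2)),
    "star": rng.normal([-1.0, 1.2], [0.4, 0.2], (300, 2)),
}

# Learn a factorized Gaussian per class: per-feature mean and scatter.
params = {c: (X.mean(axis=0), X.std(axis=0)) for c, X in train.items()}

def log_likelihood(x, mu, sd):
    """Log of a product of 1-D Gaussians (constant terms dropped)."""
    return float(np.sum(-0.5 * ((x - mu) / sd) ** 2 - np.log(sd)))

def classify(x):
    # Equal class priors assumed; pick the class with the highest
    # likelihood of producing these photometric features.
    x = np.asarray(x, dtype=float)
    return max(params, key=lambda c: log_likelihood(x, *params[c]))
```

Replacing the factorized Gaussians with full multivariate densities (or kernel density estimates) recovers the more general multi-dimensional distribution method the abstract favours, at the cost of needing a larger training set.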

15.
We present the final analysis of the European Large Area Infrared Space Observatory (ISO) Survey (ELAIS) 15-μm observations, carried out with the ISO Camera (ISOCAM) instrument on board ISO.
The data-reduction method, known as the Lari Method, is based on a mathematical model of the behaviour of the detector and was specifically designed for the detection of faint sources in ISOCAM/ISO Photopolarimeter (ISOPHOT) data. The method is fully interactive and leads to very reliable and complete source lists.
The resulting catalogue includes 1923 sources detected with a signal-to-noise ratio > 5 in the 0.5–100 mJy flux range, over an area of 10.85 deg² split into four fields, making it the largest non-serendipitous extragalactic source catalogue obtained to date from ISO data.
This paper presents the concepts underlying the data-reduction method together with its latest enhancements. The data-reduction process, the production and basic properties of the resulting catalogue are discussed. The catalogue quality is assessed by means of detailed simulations, optical identifications and comparison with previous analyses.

16.
The quantitative spectroscopy of stellar objects in complex environments is limited mainly by the ability to separate the object from the background. Standard slit spectroscopy, restricting the field of view to one dimension, is in general not the proper technique. The emerging integral-field (3D) technique, with spatially resolved spectra over a two-dimensional field of view, offers great potential for applying advanced subtraction methods. In this paper an image-reconstruction algorithm that separates point sources from a smooth background is applied to 3D data. Several performance tests demonstrate the photometric quality of the method. The algorithm is applied to real 3D observations of a sample planetary nebula in M31, whose spectrum is contaminated by the bright and complex galaxy background. The ability to separate sources is also studied in a crowded field in M33. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

17.
18.
We have developed a source detection algorithm based on the minimal spanning tree (MST), a graph-theoretical method useful for finding clusters in a given set of points. The algorithm is applied to two-dimensional γ-ray images in which the points correspond to the arrival directions of photons, and possible sources are associated with the regions where they cluster. Filters to select these clusters and to reduce spurious detections are introduced. An empirical study of the statistical properties of the MST on random fields is carried out in order to derive criteria for estimating the best filter values. We also introduce two parameters useful for verifying the goodness of candidate sources. To show how the MST algorithm works in practice, we present an application to an EGRET observation of the Virgo field, at high Galactic latitude and with a low and rather uniform background, in which several sources are detected.
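The MST detection scheme described above can be sketched directly: build the minimum spanning tree of the photon positions, cut the long edges so that only tight clumps stay connected, and keep the connected components that contain enough photons. The field geometry, the cut at half the mean MST edge length and the 15-photon filter below are invented values for illustration; the paper derives its filter values empirically from random fields.

```python
import numpy as np
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(3)

# Photon arrival directions: uniform background over a 10x10 deg field
# plus one compact clump of photons (the candidate gamma-ray source).
background = rng.uniform(0.0, 10.0, (200, 2))
source = rng.normal([7.0, 3.0], 0.05, (30, 2))
photons = np.vstack([background, source])

# Build the minimum spanning tree of the photon positions...
mst = minimum_spanning_tree(squareform(pdist(photons))).toarray()

# ...then apply the length filter: cut every edge longer than a fraction
# of the mean MST edge length, so only tight clumps stay connected.
edge_lengths = mst[mst > 0]
pruned = np.where(mst > 0.5 * edge_lengths.mean(), 0.0, mst)

# Connected components of the pruned forest are photon clusters; a
# minimum-photons filter rejects small chance groupings of background.
n_comp, labels = connected_components(pruned, directed=False)
candidates = [c for c in range(n_comp) if np.sum(labels == c) >= 15]
```

Each surviving component is a source candidate; its centroid estimates the source position, and quantities such as the member count and mean internal edge length can serve as the goodness parameters the abstract mentions.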

19.
An analysis of the spatial fluctuations in 15 deep ASCA SIS0 images has been conducted in order to probe the 2–10 keV X-ray source counts down to a flux limit of ∼2 × 10⁻¹⁴ erg cm⁻² s⁻¹. Special care has been taken in modelling the fluctuations in terms of the sensitivity maps of each of the 16 regions (5.6 × 5.6 arcmin² each) into which the SIS0 has been divided, by means of ray-tracing simulations with improved optical constants for the X-ray telescope. The very extended 'sidelobes' (extending up to a couple of degrees) exhibited by these sensitivity maps make our analysis sensitive to both faint on-axis sources and brighter off-axis ones, the former being dominant. The source counts in the range (2–12) × 10⁻¹⁴ erg cm⁻² s⁻¹ are found to be close to a Euclidean form, which extrapolates well from previous results at higher fluxes and is in reasonable agreement with some recent ASCA surveys. However, our results disagree with the deep survey counts of Georgantopoulos et al. The possibility that the source counts flatten to a sub-Euclidean form, as is observed at soft energies in ROSAT data, is only weakly constrained to happen at a flux < 1.8 × 10⁻¹² erg cm⁻² s⁻¹ (90 per cent confidence). Down to the sensitivity limit of our analysis, the integrated contribution of the sources whose imprint is seen in the fluctuations amounts to ∼35 ± 13 per cent of the extragalactic 2–10 keV X-ray background.

20.
We discuss the technique of wide-field imaging as it applies to very long baseline interferometry (VLBI). In the past, VLBI data sets were usually averaged so severely that the field of view was typically restricted to regions extending a few hundred milliarcseconds from the phase centre of the field. Recent advances in data-analysis techniques, together with increasing data-storage capabilities and enhanced computer processing power, now permit VLBI images to be made whose angular size represents a significant fraction of an individual antenna's primary beam. This technique has recently been applied successfully to several large-separation gravitational lens systems, compact supernova remnants in the starburst galaxy M82, and two faint radio sources located within the same VLA FIRST field. It seems likely that other VLBI observing programmes might benefit from this wide-field approach to VLBI data analysis. With the raw sensitivity of global VLBI set to improve by a factor of 4–5 over the coming few years, the number of sources that can be detected in a given field will rise considerably. In addition, continued progress in VLBI's ability to image relatively faint and extended low-brightness-temperature features (such as hot-spots in large-scale astrophysical jets) is also to be expected. As VLBI sensitivity approaches the μJy level, a wide-field approach to data analysis becomes inevitable.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号