XMM-LSS master catalogue¶

This notebook presents the merge of the various pristine catalogues to produce the HELP master catalogue on XMM-LSS.

In [1]:
from herschelhelp_internal import git_version
print("This notebook was run with herschelhelp_internal version: \n{}".format(git_version()))
This notebook was run with herschelhelp_internal version: 
33f5ec7 (Wed Dec 6 16:56:17 2017 +0000)
In [2]:
%matplotlib inline
#%config InlineBackend.figure_format = 'svg'

import matplotlib.pyplot as plt
plt.rc('figure', figsize=(10, 6))

import os
import time

from astropy import units as u
from astropy.coordinates import SkyCoord
from astropy.table import Column, Table
import numpy as np
from pymoc import MOC

from herschelhelp_internal.masterlist import merge_catalogues, nb_merge_dist_plot, specz_merge
from herschelhelp_internal.utils import coords_to_hpidx, ebv, gen_help_id, inMoc
In [3]:
TMP_DIR = os.environ.get('TMP_DIR', "./data_tmp")
OUT_DIR = os.environ.get('OUT_DIR', "./data")
SUFFIX = os.environ.get('SUFFIX', time.strftime("_%Y%m%d"))

try:
    os.makedirs(OUT_DIR)
except FileExistsError:
    pass

I - Reading the prepared pristine catalogues¶

In [4]:
#candels = Table.read("{}/CANDELS.fits".format(TMP_DIR))           # 1.1
#cfht_wirds = Table.read("{}/CFHT-WIRDS.fits".format(TMP_DIR))     # 1.3
#cfhtls_wide = Table.read("{}/CFHTLS-WIDE.fits".format(TMP_DIR))   # 1.4a
#cfhtls_deep = Table.read("{}/CFHTLS-DEEP.fits".format(TMP_DIR))   # 1.4b
#We no longer use CFHTLenS as it is the same raw data set as CFHTLS-WIDE
# cfhtlens = Table.read("{}/CFHTLENS.fits".format(TMP_DIR))         # 1.5
#decals = Table.read("{}/DECaLS.fits".format(TMP_DIR))             # 1.6
#servs = Table.read("{}/SERVS.fits".format(TMP_DIR))               # 1.8
#swire = Table.read("{}/SWIRE.fits".format(TMP_DIR))               # 1.7
#hsc_wide = Table.read("{}/HSC-WIDE.fits".format(TMP_DIR))         # 1.9a
#hsc_deep = Table.read("{}/HSC-DEEP.fits".format(TMP_DIR))         # 1.9b
#hsc_udeep = Table.read("{}/HSC-UDEEP.fits".format(TMP_DIR))       # 1.9c
#ps1 = Table.read("{}/PS1.fits".format(TMP_DIR))                   # 1.10
#sxds = Table.read("{}/SXDS.fits".format(TMP_DIR))                 # 1.11
#sparcs = Table.read("{}/SpARCS.fits".format(TMP_DIR))             # 1.12
dxs = Table.read("{}/UKIDSS-DXS.fits".format(TMP_DIR))            # 1.13
uds = Table.read("{}/UKIDSS-UDS.fits".format(TMP_DIR))            # 1.14
#vipers = Table.read("{}/VIPERS.fits".format(TMP_DIR))             # 1.15
#vhs = Table.read("{}/VISTA-VHS.fits".format(TMP_DIR))             # 1.16
#video = Table.read("{}/VISTA-VIDEO.fits".format(TMP_DIR))         # 1.17
#viking = Table.read("{}/VISTA-VIKING.fits".format(TMP_DIR))       # 1.18

II - Merging tables¶

We first merge the optical catalogues and then add the infrared ones; in the full XMM-LSS merge we start with PanSTARRS because it covers the whole field. This notebook only merges the UKIDSS catalogues (DXS and UDS), starting with DXS.

At every step, we look at the distribution of the distances separating the sources from one catalogue to the other (within a maximum radius) to determine the best cross-matching radius.
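The plot is produced by the internal nb_merge_dist_plot helper. As a rough illustration of the idea only (not the HELP implementation), one could histogram the pairwise separations directly with astropy; the function and the 5 arcsec default below are illustrative:

import matplotlib.pyplot as plt
from astropy import units as u
from astropy.coordinates import SkyCoord

def separation_histogram(coords_1, coords_2, max_radius=5 * u.arcsec):
    """Histogram all pairwise separations below max_radius.

    Real counterparts pile up at small separations, while chance alignments
    increase with radius; the cross-matching radius is chosen near the dip
    between the two regimes.
    """
    _, _, d2d, _ = coords_1.search_around_sky(coords_2, max_radius)
    plt.hist(d2d.arcsec, bins=50)
    plt.xlabel("Separation [arcsec]")
    plt.ylabel("Number of pairs")

# e.g. separation_histogram(SkyCoord(dxs['dxs_ra'], dxs['dxs_dec']),
#                           SkyCoord(uds['uds_ra'], uds['uds_dec']))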

Start with DXS¶

In [5]:
master_catalogue = dxs
master_catalogue['dxs_ra'].name = 'ra'
master_catalogue['dxs_dec'].name = 'dec'

Add UDS¶

In [6]:
nb_merge_dist_plot(
    SkyCoord(master_catalogue['ra'], master_catalogue['dec']),
    SkyCoord(uds['uds_ra'], uds['uds_dec'])
)
HELP Warning: There weren't any cross matches. The two surveys probably don't overlap.
In [7]:
# DXS and UDS do not overlap (see the warning above), so the radius choice has
# no effect here; we keep the standard 0.8 arc-second radius.
master_catalogue = merge_catalogues(master_catalogue, uds, "uds_ra", "uds_dec", radius=0.8*u.arcsec)

Cleaning¶

When we merge the catalogues, astropy masks the non-existent values (e.g. when a row comes from only one catalogue and has no counterpart in the other, the columns from the other catalogue are masked for that row). We indicate to use NaN for masked values in float columns, False for flag columns, and -1 for ID columns.

In [8]:
for col in master_catalogue.colnames:
    if "m_" in col or "merr_" in col or "f_" in col or "ferr_" in col or "stellarity" in col:
        master_catalogue[col] = master_catalogue[col].astype(float)
        master_catalogue[col].fill_value = np.nan
    elif "flag" in col:
        master_catalogue[col].fill_value = 0
    elif "id" in col:
        master_catalogue[col].fill_value = -1
        
master_catalogue = master_catalogue.filled()
In [9]:
# Since this is not the final merged catalogue, we rename the columns to make them unique.
master_catalogue['ra'].name = 'ukidss_ra'
master_catalogue['dec'].name = 'ukidss_dec'
master_catalogue['flag_merged'].name = 'ukidss_flag_merged'
In [10]:
master_catalogue[:10].show_in_notebook()
Out[10]:
<Table length=10>
(Preview of the first ten rows. Columns: dxs_id, ukidss_ra, ukidss_dec [deg], UKIDSS J and K aperture and total magnitudes and fluxes with errors and flags, dxs_stellarity, dxs_flag_cleaned, dxs_flag_gaia, ukidss_flag_merged, uds_id, the corresponding UDS J, H and K columns, uds_stellarity, uds_flag_cleaned and uds_flag_gaia.)
V - Adding unique identifier¶

In [11]:
# Build a temporary UKIDSS identifier by concatenating the (stringified) DXS and UDS identifiers.
master_catalogue.add_column(Column(data=(np.char.array(master_catalogue['dxs_id'].astype(str))
                                          + np.char.array(master_catalogue['uds_id'].astype(str))),
                                   name="ukidss_intid"))
In [12]:
id_names = []
for col in master_catalogue.colnames:
    if '_id' in col:
        id_names += [col]
    if '_intid' in col:
        id_names += [col]
        
print(id_names)
['dxs_id', 'uds_id', 'ukidss_intid']

VII - Choosing between multiple values for the same filter¶

VII. d UKIDSS DXS and UDS¶

There is no overlap between UDS and DXS, so we simply merge the two sets of columns.

In [13]:
# The H band is only in UDS, so we simply rename its columns
for col in master_catalogue.colnames:
    if 'uds_h' in col:
        master_catalogue[col].name = col.replace('uds_h', 'ukidss_h')
        
# DXS and UDS do not overlap, so we can copy the UDS measurements into the
# ukidss_* columns wherever they exist.
has_uds_k =     ~np.isnan(master_catalogue['f_uds_k'])
has_uds_j =     ~np.isnan(master_catalogue['f_uds_j'])
has_ukidss_k =  ~np.isnan(master_catalogue['f_ukidss_k'])
has_ukidss_j =  ~np.isnan(master_catalogue['f_ukidss_j'])

master_catalogue['f_ukidss_k'][has_uds_k] = master_catalogue['f_uds_k'][has_uds_k]
master_catalogue['ferr_ukidss_k'][has_uds_k] = master_catalogue['ferr_uds_k'][has_uds_k]
master_catalogue['m_ukidss_k'][has_uds_k] = master_catalogue['m_uds_k'][has_uds_k]
master_catalogue['merr_ukidss_k'][has_uds_k] = master_catalogue['merr_uds_k'][has_uds_k]
master_catalogue['flag_ukidss_k'][has_uds_k] = master_catalogue['flag_uds_k'][has_uds_k]

master_catalogue['f_ukidss_j'][has_uds_j] = master_catalogue['f_uds_j'][has_uds_j]
master_catalogue['ferr_ukidss_j'][has_uds_j] = master_catalogue['ferr_uds_j'][has_uds_j]
master_catalogue['m_ukidss_j'][has_uds_j] = master_catalogue['m_uds_j'][has_uds_j]
master_catalogue['merr_ukidss_j'][has_uds_j] = master_catalogue['merr_uds_j'][has_uds_j]
master_catalogue['flag_ukidss_j'][has_uds_j] = master_catalogue['flag_uds_j'][has_uds_j]

has_ap_uds_k =  ~np.isnan(master_catalogue['f_ap_uds_k'])
has_ap_uds_j =  ~np.isnan(master_catalogue['f_ap_uds_j'])
has_ap_ukidss_k =  ~np.isnan(master_catalogue['f_ap_ukidss_k'])
has_ap_ukidss_j =  ~np.isnan(master_catalogue['f_ap_ukidss_j'])

master_catalogue['f_ap_ukidss_k'][has_ap_uds_k] = master_catalogue['f_ap_uds_k'][has_ap_uds_k]
master_catalogue['ferr_ap_ukidss_k'][has_ap_uds_k] = master_catalogue['ferr_ap_uds_k'][has_ap_uds_k]
master_catalogue['m_ap_ukidss_k'][has_ap_uds_k] = master_catalogue['m_ap_uds_k'][has_ap_uds_k]
master_catalogue['merr_ap_ukidss_k'][has_ap_uds_k] = master_catalogue['merr_ap_uds_k'][has_ap_uds_k]

master_catalogue['f_ap_ukidss_j'][has_ap_uds_j] = master_catalogue['f_ap_uds_j'][has_ap_uds_j]
master_catalogue['ferr_ap_ukidss_j'][has_ap_uds_j] = master_catalogue['ferr_ap_uds_j'][has_ap_uds_j]
master_catalogue['m_ap_ukidss_j'][has_ap_uds_j] = master_catalogue['m_ap_uds_j'][has_ap_uds_j]
master_catalogue['merr_ap_ukidss_j'][has_ap_uds_j] = master_catalogue['merr_ap_uds_j'][has_ap_uds_j]

master_catalogue.remove_columns(['f_uds_j','ferr_uds_j','m_uds_j','merr_uds_j','flag_uds_j',
                               'f_uds_k','ferr_uds_k','m_uds_k','merr_uds_k','flag_uds_k',
                               'f_ap_uds_j','ferr_ap_uds_j','m_ap_uds_j','merr_ap_uds_j',
                               'f_ap_uds_k','ferr_ap_uds_k','m_ap_uds_k','merr_ap_uds_k'])


# Record, for each source and band, which survey (DXS or UDS) the merged
# UKIDSS flux comes from.
ukidss_origin = Table()
ukidss_origin.add_column(master_catalogue['ukidss_intid'])
origin = np.full(len(master_catalogue), '     ', dtype='<U5')
origin[has_uds_k] = "UDS"
origin[has_ukidss_k] = "DXS"
ukidss_origin.add_column(Column(data=origin, name= 'f_ukidss_k' ))
origin = np.full(len(master_catalogue), '     ', dtype='<U5')
origin[has_uds_j] = "UDS"
origin[has_ukidss_j] = "DXS"
ukidss_origin.add_column(Column(data=origin, name= 'f_ukidss_j' ))
origin_ap = np.full(len(master_catalogue), '     ', dtype='<U5')
origin_ap[has_ap_uds_k] = "UDS"
origin_ap[has_ap_ukidss_k] = "DXS"
ukidss_origin.add_column(Column(data=origin_ap, name= 'f_ap_ukidss_k' ))
origin_ap = np.full(len(master_catalogue), '     ', dtype='<U5')
origin_ap[has_ap_uds_j] = "UDS"
origin_ap[has_ap_ukidss_j] = "DXS"
ukidss_origin.add_column(Column(data=origin_ap, name= 'f_ap_ukidss_j' ))

ukidss_origin.write("{}/xmm-lss_ukidss_fluxes_origins{}.fits".format(OUT_DIR, SUFFIX), overwrite=True)

IX - Cross-identification table¶

We produce a table associating each HELP identifier with the identifiers of the sources in the pristine catalogues. It can be used to easily retrieve additional information from them.

For convenience, we also cross-match the master list with the SDSS catalogue and add the objID associated with each source, if any. TODO: should we correct the astrometry with respect to Gaia positions?
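That step is not performed in this partial notebook. As a hedged sketch only, assuming the final merged catalogue carries a help_id column (generated elsewhere with gen_help_id), the cross-identification table could be built from the identifier columns; the output file name below is illustrative:

# Hypothetical sketch: collect the HELP identifier and the pristine-catalogue
# identifiers into a separate cross-identification table.
id_columns = ['help_id'] + [col for col in master_catalogue.colnames
                            if col.endswith('_id') or col.endswith('_intid')]
master_catalogue_crossids = master_catalogue[id_columns]
master_catalogue_crossids.write(
    "{}/master_list_cross_ident_xmm-lss{}.fits".format(OUT_DIR, SUFFIX),
    overwrite=True)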

XI - Saving the catalogue¶

In [14]:
columns = ["help_id", "field", "ra", "dec", "hp_idx"]

bands = [column[5:] for column in master_catalogue.colnames if 'f_ap' in column]
for band in bands:
    columns += ["f_ap_{}".format(band), "ferr_ap_{}".format(band),
                "m_ap_{}".format(band), "merr_ap_{}".format(band),
                "f_{}".format(band), "ferr_{}".format(band),
                "m_{}".format(band), "merr_{}".format(band),
                "flag_{}".format(band)]    
    
columns += ["stellarity", "stellarity_origin", "flag_cleaned", "flag_merged", "flag_gaia", "flag_optnir_obs", "flag_optnir_det", 
            "zspec", "zspec_qual", "zspec_association_flag", "ebv"]
In [15]:
# We check for columns in the master catalogue that are not in the list above
# and will therefore not be saved to disk.
print("Missing columns: {}".format(set(master_catalogue.colnames) - set(columns)))
Missing columns: {'uds_flag_gaia', 'ukidss_ra', 'uds_id', 'dxs_stellarity', 'ukidss_dec', 'dxs_flag_cleaned', 'uds_flag_cleaned', 'ukidss_flag_merged', 'ukidss_intid', 'dxs_flag_gaia', 'uds_stellarity', 'dxs_id'}
In [16]:
master_catalogue.write("{}/ukidss_merged_catalogue_xmm-lss.fits".format(TMP_DIR), overwrite=True)