Processing protocol

How do we process the SPHERE data?

The HC-DC aims at providing a full reduction of all public SPHERE data, in order to make them available to the whole scientific community without restriction. Given the huge amount of data, we spent several years developing a fully automatic pipeline that provides good reductions in most cases. It works well for typical SPHERE science cases (exoplanet/disc imaging, high-contrast extended sources) acquired in average and good observing conditions. However, for poor or highly variable observing conditions, or for less common science cases (notably low-contrast extended sources such as resolved Solar System objects), the outputs provided by the HC-DC might not be of paper-grade quality and need to be carefully checked before use in publications.

The processing is done as follows: we apply the pipeline to the observations obtained for each semester. The automatic processing allows large amounts of data to be handled and ensures traceability and links between processes through the input and output file lists. After checking that the processing reached the final step, the quality of the reduction is visually checked based on the final stacked no-ADI image. This validation step is described on the data quality page.
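As an illustration of this bookkeeping, here is a minimal Python sketch of a per-semester driver that records the input and output file lists of each reduction. All names (the reduce callable, the manifest.json convention, the directory layout) are hypothetical and do not come from the actual HC-DC code:

```python
import json
from pathlib import Path
from typing import Callable, List

def process_semester(semester_dir: Path,
                     reduce: Callable[[List[Path]], List[Path]]) -> None:
    """Illustrative per-semester driver. `reduce` stands in for the actual
    HC-DC reduction; the manifest.json convention is an assumption."""
    for obs_dir in sorted(p for p in semester_dir.iterdir() if p.is_dir()):
        inputs = sorted(obs_dir.glob("*.fits"))
        outputs = reduce(inputs)
        # Keep explicit input/output lists so every product can be traced
        # back to the raw frames it was built from.
        manifest = {"inputs": [str(f) for f in inputs],
                    "outputs": [str(f) for f in outputs]}
        (obs_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
```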

List of available data

Important: Do not forget the acknowledgements and the references to the papers describing the reduction tools!

Algorithms

We provide here a short overview of the tools used to process the SPHERE data. All details can be found in the manual. The processing performs the following basic steps:

IRDIS/SPHERE and IFS/SPHERE calibration files

These level 0 data are reduced with the ESO DRH pipeline.

IRDIS and IFS SPHERE data

First release (2018-2021): The IRDIS and IFS data have been processed following different approaches. In this first release, the pre-reduction for both instruments was performed with the ESO DRH pipeline with some improvements, followed by a frame-selection step and the SpeCal routine set (Galicher et al. 2018).

New release (2023-2024) for IRDIS data: In the new reduction (performed in 2022-2024 for P95-P104), we still use this global approach for the IRDIS data, but with better data centering and the addition of new keywords. Importantly, it has also undergone the visual validation step, which was not the case for the first release. All data are corrected for the astrometric offset from North (Maire et al. 2016).
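For orientation, here is a minimal sketch of what correcting for the true-North offset amounts to when derotating a frame. The offset value and the sign convention must be taken from Maire et al. (2016) and the instrument documentation; they appear only as parameters here:

```python
import numpy as np
from scipy.ndimage import rotate

def derotate_frame(frame: np.ndarray,
                   parallactic_angle_deg: float,
                   true_north_offset_deg: float) -> np.ndarray:
    """Rotate a frame so that North points up, folding in the instrument
    true-north offset. The sign convention below is an assumption."""
    total_deg = -(parallactic_angle_deg + true_north_offset_deg)
    return rotate(frame, total_deg, reshape=False, order=3,
                  mode="constant", cval=0.0)
```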

The initial release of the IFS reduction was based on a combination of the ESO DRH pipeline with inputs from D. Mesa and R. Gratton. We have identified a few artefacts in some of the data reductions, as well as the possibility to improve the wavelength calibration. We plan to implement a new pipeline in the near future.

This release is based on a combination of the ESO DRH pipeline, improved data centering, the addition of the AO system information, bad-frame selection, and the level 2 processing tools gathered in the SpeCal routine set (Galicher et al. 2018), which implements the following algorithms (a sketch of the cADI and PCA ideas is given after this list):

  • NoADI: simple stacking of the master cubes
  • cADI: Classical Angular Differential Imaging (Marois et al. 2006, Lafrenière et al. 2007)
  • TLOCI: flavour of Angular Differential Imaging based on the Locally Optimised Combination of Images (Marois et al. 2014, Galicher et al. 2018)
  • PCA: flavour of Angular Differential Imaging based on Principal Component Analysis (Amara & Quanz 2012, Soummer et al. 2012)
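To make the ideas concrete, here is a minimal numpy/scipy sketch of cADI and PCA-based ADI. The function names, the derotation sign convention, and the combination choices are illustrative assumptions; the actual SpeCal implementation differs in many details (see Galicher et al. 2018):

```python
import numpy as np
from scipy.ndimage import rotate

def derotate_and_stack(cube: np.ndarray, angles_deg: np.ndarray) -> np.ndarray:
    """Derotate each frame by its angle and median-combine.
    The sign of the angles depends on the instrument convention."""
    derot = [rotate(f, -a, reshape=False, order=3)
             for f, a in zip(cube, angles_deg)]
    return np.median(derot, axis=0)

def cadi(cube: np.ndarray, angles_deg: np.ndarray) -> np.ndarray:
    """Classical ADI: subtract the temporal median PSF,
    then derotate and stack the residuals."""
    residuals = cube - np.median(cube, axis=0)
    return derotate_and_stack(residuals, angles_deg)

def pca_adi(cube: np.ndarray, angles_deg: np.ndarray,
            n_modes: int = 5) -> np.ndarray:
    """PCA-based ADI: remove the first principal components of the
    stack (the stellar speckle pattern), then derotate and stack."""
    nt, ny, nx = cube.shape
    flat = cube.reshape(nt, ny * nx)
    flat = flat - flat.mean(axis=0)            # centre each pixel time series
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    modes = vt[:n_modes]                       # eigen-images of the PSF
    residuals = flat - flat @ modes.T @ modes  # project out the PSF modes
    return derotate_and_stack(residuals.reshape(nt, ny, nx), angles_deg)
```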

Note: the old IRDIS release is only available in the workspace PUBLIC_reduced_science_obsolete on the client. The new IRDIS release is available in the DIVA+ database, in the workspace PUBLIC_reduced_science when requesting data from the client, and on the ESO archive for a subset of outputs.

ZIMPOL/SPHERE data

The ZIMPOL data are reduced with the Zürich pipeline (Schmid et al. 2018; manual available here). It is an IDL pipeline, adapted to the HC-DC, that performs bias subtraction, flat-fielding, and extraction of the polarimetric / total intensity signal. All data are rotated by 2 degrees clockwise to account for the astrometric offset from North. No beamshift correction is applied in the automatic reduction, but it can be applied upon request. Polarimetric calibrations are applied using the modulation-demodulation efficiencies.
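A minimal sketch of the rotation step, assuming scipy's counterclockwise-positive angle convention (so a clockwise rotation is a negative angle; the sign should be verified on real data):

```python
import numpy as np
from scipy.ndimage import rotate

def correct_north_offset(frame: np.ndarray, offset_deg: float = 2.0) -> np.ndarray:
    """Rotate a ZIMPOL frame by `offset_deg` clockwise to compensate the
    astrometric offset from North (sign convention: assumption)."""
    return rotate(frame, -offset_deg, reshape=False, order=3)
```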

The outputs of the Zürich pipeline routines are two cubes, one for each camera, containing (see the unpacking sketch after this list):

  • 4 frames when observing in polarisation mode: I_q, Q, I_u, U
  • 2 frames when observing in imaging mode, corresponding to even rows (with astrophysical signal) and odd rows (masked rows)
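As an illustration, the cubes can be unpacked as follows, assuming FITS files with the frames stored in the order documented above (the file name is hypothetical):

```python
from astropy.io import fits

cube = fits.getdata("zimpol_cam1.fits")  # hypothetical file name

if cube.shape[0] == 4:                   # polarisation mode
    i_q, q, i_u, u = cube
else:                                    # imaging mode: 2 frames
    signal_rows, masked_rows = cube      # even rows / odd rows
```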

A custom Python routine is then applied to the polarimetric data to produce polarisation maps for both cameras (a sketch of these computations follows the list):

  • I_Q: Stokes Q intensity frame
  • I_U: Stokes U intensity frame
  • I: intensity
  • Q: Stokes Q
  • U: Stokes U
  • PI: polarised intensity
  • DOLP: Degree Of Linear Polarisation
  • AOLP: Angle Of Linear Polarisation
  • U_Phi: amplitude of the polarised intensity in the direction offset by 45° from the radial vector
  • Q_Phi: radial (0°, negative signal) and tangential (90°, positive signal) components of the linearly polarised intensity
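These quantities follow from the standard Stokes formalism. Below is a minimal sketch using one common sign convention for Q_Phi/U_Phi (chosen so that tangential polarisation is positive in Q_Phi, as described above); the star is assumed at the image centre, and the actual routine's conventions may differ:

```python
import numpy as np

def polarimetric_products(i_q, q, i_u, u):
    """Compute the standard polarisation maps from the Stokes frames.
    Sign conventions and the assumed star position (image centre)
    are assumptions to verify against the actual routine."""
    intensity = 0.5 * (i_q + i_u)
    pi = np.sqrt(q**2 + u**2)                  # polarised intensity
    dolp = pi / intensity                      # degree of linear polarisation
    aolp = 0.5 * np.degrees(np.arctan2(u, q))  # angle of linear polarisation

    ny, nx = q.shape
    yy, xx = np.indices((ny, nx))
    phi = np.arctan2(xx - nx / 2, yy - ny / 2)  # azimuth around image centre
    q_phi = -q * np.cos(2 * phi) - u * np.sin(2 * phi)
    u_phi = +q * np.sin(2 * phi) - u * np.cos(2 * phi)
    return intensity, pi, dolp, aolp, q_phi, u_phi
```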

The routines provide stacked images. Upon request, a beamshift correction can be applied.

For imaging, a custom Python routine also recenters the frames (in the absence of a coronagraph) and performs a frame selection to remove outliers. The routine returns a datacube along with the list of derotation angles needed to perform Angular Differential Imaging for sequences obtained in pupil-stabilised mode.
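A minimal sketch of what such recentering and frame selection can look like; the centring estimator and the selection criterion below are illustrative assumptions, not the actual HC-DC routine:

```python
import numpy as np
from scipy.ndimage import center_of_mass, shift

def recenter(frame: np.ndarray) -> np.ndarray:
    """Shift the photocentre to the array centre (no-coronagraph case).
    A centre-of-mass estimate is used for simplicity; the actual routine
    may fit the PSF instead."""
    cy, cx = center_of_mass(frame)
    ny, nx = frame.shape
    return shift(frame, ((ny - 1) / 2 - cy, (nx - 1) / 2 - cx), order=3)

def select_frames(cube: np.ndarray, n_sigma: float = 3.0) -> np.ndarray:
    """Sigma-clip frames on their correlation with the median frame;
    the actual selection criterion may differ."""
    ref = np.median(cube, axis=0)
    scores = np.array([np.corrcoef(f.ravel(), ref.ravel())[0, 1] for f in cube])
    good = scores > scores.mean() - n_sigma * scores.std()
    return cube[good]
```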