Hawaii Two-0 Progress Report

NEP CFHT Photometry

NEP HSC Photometry

NEP Keck Spectroscopy

EDFF CFHT Photometry

EDFF HSC Photometry

EDFF Keck Spectroscopy
H20 Observing Runs

S18A

Subaru HSC — Lost 4 nights due to earthquakes

S18B
Subaru HSC — Lost 3 nights due to earthquakes, 2 successful nights (Jan 1 & 2)

S19A
Subaru HSC — Lost 1.25 nights due to weather, 1.25 successful nights; May 30 & 31 (2nd half), July 3-5 (2nd half)

S19B
Subaru HSC — 2 successful nights, October 23 & 24

Keck II DEIMOS — Lost 0.75 nights due to weather, 1.25 successful nights, August 28 & 29

S20A
Subaru HSC — Update (March 24): 3 nights cancelled because all MKO telescopes were shut down through the end of April due to the coronavirus pandemic; April 17, 18, 20, 21, 26, 28 (2nd half)

Keck II DEIMOS — awarded 3 nights, June 27 & 28, July 15

S20B
Subaru HSC — 2 successful nights, lost 1.5 nights due to weather

Keck II DEIMOS — 2 successful nights, 1 night lost to weather

End of LCP

S21A

CFHT MegaCam — 20 hrs of queue time awarded, 16 hrs executed (PI Zalesky)

Subaru HSC — 17.4 hrs of queue time awarded, 7.19 hrs executed (PI Zalesky)

Keck II DEIMOS — 3 successful nights (PI Sanders)

S21B

CFHT MegaCam — 12.4 hrs of queue data obtained (PI Zalesky)

CFHT MegaCam — 24 hrs of queue data obtained (PI Sanders)

Subaru HSC — 28 hrs of queue time awarded, 12.6 hrs executed (PI Zalesky)

Keck II DEIMOS — 3 successful nights (PI Sanders)

S22A

CFHT MegaCam — 15 hrs of queue data obtained (PI Zalesky)

Subaru HSC — 22 hrs of queue time awarded, 10 hrs executed (PI Zalesky)

Keck II DEIMOS — 3 successful nights (PI McPartland)

S22B

CFHT MegaCam — 30 hrs of queue data obtained (PI Sanders)

Keck I MOSFIRE — 1 successful night (PI Sanders)

Keck II DEIMOS — 3 successful nights (PI Sanders)

S23A

Subaru HSC — awarded 20 hrs of queue time (PI Zalesky)

Keck II DEIMOS — 3 successful nights (PI Sanders)

 

S23B

Subaru HSC — 20 hrs of queue time awarded, 10 hrs executed owing to the telescope shutdown in Nov. (PI Sanders)

Keck I MOSFIRE — 1 successful night (PI Sanders)

Keck II DEIMOS — 3 successful nights (PI Szapudi)

 

S24A

CFHT MegaCam — 32.4 hrs of queue time awarded (PI Sanders)

Subaru HSC — 3.5N of queue time awarded (PI Sanders)

Keck I MOSFIRE — awarded 0.5N (PI Murphree)

HSC Coverage Maps

EDFF HSC Coverage

The solid black line indicates the boundary of the EDFF H20 coverage. Total enclosed area is ~10 square degrees.

NEP HSC Coverage

The dashed white circle indicates the boundary of the NEP H20 coverage. Total enclosed area is ~10 square degrees.
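As a quick consistency check (a sketch, assuming the NEP footprint is well approximated by a circle; the variable names below are ours), a circular field enclosing ~10 square degrees implies a radius of roughly 1.8 degrees:

```python
import math

# Quoted enclosed area of the NEP H20 footprint (square degrees).
area_deg2 = 10.0

# Radius of a circle with that area: r = sqrt(A / pi).
radius_deg = math.sqrt(area_deg2 / math.pi)

print(f"Implied NEP radius: {radius_deg:.2f} deg")  # ~1.78 deg
```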

Archival Data

Through diligent searching of the archives we already have NEP HSC data from the HEROES and AKARI-NEP projects, and CDF-S data from a collection of small projects. These archival data make up for nearly the entirety of the HSC time lost in 2018.

All data are being re-reduced at the IfA (by A. Repp & C. McPartland) using the latest photometric/astrometric calibrations from Pan-STARRS and Gaia, along with the latest version (6.7) of the HSC data reduction pipeline.

 

 

Data Reduction Progress

 

Computing
  • The HSC Data Reduction Pipeline has been successfully installed and tested on the UH Cray supercomputer
  • Co-I Chambers will ensure the team has access to the latest Pan-STARRS data for astrometric and flux calibration.
  • Generation of the HSC mosaics will be done by Co-Is Repp & McPartland.
  • The IRAC image mosaic pipeline of Co-I Capak has recently been optimized for cluster/supercomputer environments. The runtime of the pipeline analysis is now only a few hours, rather than a few weeks with the original version.
  • IRAC photometry will be measured using IRClean (Co-I Capak, lead), which was successfully applied in COSMOS and SPLASH (Laigle et al. 2016).
Extraction of Data from the Archives
  • We have discovered a significant amount of HSC data in the Subaru archive covering the two H20 fields:
    • NEP: HEROES and AKARI-NEP programs (~50% of total data needed)
    • CDF-S: various small programs (~10% of total data needed)
  • With these data in hand, we have nearly made up for the HSC time lost in 2018. See Archival Data on the Progress Page for more details.
Preliminary Tests
Purchase of Data Storage/Reduction Servers
  • We have purchased three dedicated servers for H20 data storage and reduction. This will allow us to streamline the data reduction process by minimizing latency due to data transfer.
  • Total of 364 TB of RAID storage shared by all three servers
  • Server A: 96 CPU cores, 512 GB RAM
  • Server B: 64 CPU cores, 512 GB RAM
  • Server C: 18 CPU cores, 512 GB RAM
