The SkyMapper ASVO node is the primary access point to the SkyMapper data. The first version of the node was co-developed by the National Computational Infrastructure (NCI), the Australian National University (ANU), and Intersect Australia Limited. The node has had three major data releases:
- Test Data Release (TDR) – July 2015
The Test Data Release was a preview of the Survey data, designed to show its characteristics while also testing data access methods for images and catalogues. It covers approximately 60 square degrees near RA = 13h 40m and Dec = -15 deg.
- Early Data Release (EDR) – May 2016
The Early Data Release provided data from the Short Survey across one-third of the southern sky. EDR includes all fields observed between March 2014 and March 2015 where the telescope made at least two visits in near-photometric conditions. Each visit includes an exposure in each of the six filters, uvgriz.
- First Data Release (DR1) – June 2017
The First Data Release provides data from the Shallow Survey across >98% of the southern sky, mostly covering declinations from the South Celestial Pole to +2 deg, with some extra coverage reaching +10 deg. DR1 includes fields observed between March 2014 and September 2015, with a number of quality cuts applied. It contains measurements from over 2.3 billion detections covering more than 20,200 deg² of sky, corresponding to ~300 million unique astrophysical objects from magnitude 8 to 18 (complete to ~17.5 mag, depending on the filter).
Data is currently accessible through Virtual Observatory-compatible tools such as TOPCAT and Aladin. The system supports the following VO services:
- Cone Search (including SCS)
- Image Cutouts (including SIAP)
- Full Catalogue Search (including TAP access)
More details are available on the data access page, and an alternative web interface is provided as well; a sketch of programmatic access through these services is shown below.
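As an illustration of how the three VO services listed above can be queried from a script, the minimal sketch below uses the generic pyvo client in Python. The endpoint URLs, the table name dr1.master, and the column names raj2000/dej2000 are placeholders chosen for the example and are not taken from this page; the actual service endpoints are listed on the data access page.

```python
# A minimal sketch of scripted access to the node's VO services using pyvo.
# All URLs and table/column names below are illustrative placeholders, not
# the node's real endpoints -- see the data access page for those.
import pyvo
from astropy.coordinates import SkyCoord

# Position near the TDR test field (RA = 13h 40m, Dec = -15 deg).
pos = SkyCoord(ra="13h40m00s", dec="-15d00m00s", frame="icrs")

# 1. Cone Search (SCS): catalogue sources within 0.1 deg of the position.
scs = pyvo.dal.SCSService("https://example.org/skymapper/scs")   # placeholder URL
sources = scs.search(pos=pos, radius=0.1)
print(len(sources), "sources from the cone search")

# 2. Image Cutouts (SIAP): image records overlapping a 0.1 deg region.
sia = pyvo.dal.SIAService("https://example.org/skymapper/siap")  # placeholder URL
images = sia.search(pos=pos, size=0.1)
print(len(images), "image records from the SIAP query")

# 3. Full Catalogue Search (TAP): an ADQL query against a placeholder
#    table ("dr1.master") with placeholder column names.
tap = pyvo.dal.TAPService("https://example.org/skymapper/tap")   # placeholder URL
result = tap.search(
    "SELECT TOP 10 * FROM dr1.master "
    "WHERE 1 = CONTAINS(POINT('ICRS', raj2000, dej2000), "
    "CIRCLE('ICRS', 205.0, -15.0, 0.1))"
)
print(result.to_table())
```

The same SCS, SIAP, and TAP endpoints can be registered directly in TOPCAT or Aladin, so one set of service URLs serves both scripted and interactive access.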
Technology currently used:
- Python Django
- Python CherryPy
- Basic JavaScript using jQuery and D3.js.
Hardware arrangement:
The node is currently hosted on virtualized instances at NCI.
- Database Server (8 CPUs, 32 GB RAM).
- Virtual Node hosting the user interface (2 CPUs, 8 GB RAM).
- Storage is provided by NCI. For the Early Data Release this amounts to 1.8 TB of table data (stored on an NCI virtual machine) and 19 TB of compressed images (equivalent to 50 TB uncompressed, stored on NCI's g/data1 system). The node expects data volumes to grow to approximately 8 TB of table data and up to 200 TB of images by the end of 2017.