The Front Lines of ADASS 2014 – October 6 2014 Morning Sessions

Nick Walton spoke about data processing challenges for PLATO, ESA's next-generation planet hunter. It is an approved ESA M3 mission with a launch date in 2024 and a nominal six-year mission. Broadly speaking, it aims to detect Earth-sized and Earth-mass planets in the habitable zones of bright G–K type stars, and to characterize the internal structure of these planets. PLATO will consist of 32 telescopes, each with an 81-megapixel CCD camera, and these telescopes will survey a total field of about 2250 sq deg (48.5 x 48.5 degrees).
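As a rough back-of-the-envelope illustration using only the figures quoted above (not official mission numbers), the combined focal plane amounts to roughly 2.6 gigapixels per exposure:

```python
# Back-of-the-envelope figures from the numbers quoted in the talk
# (32 telescopes, 81-Mpix cameras, ~2250 deg^2 field); illustrative only.
N_TELESCOPES = 32
PIXELS_PER_CAMERA = 81e6           # 81 megapixels

total_pixels = N_TELESCOPES * PIXELS_PER_CAMERA
print(f"Pixels read per exposure: {total_pixels:.3g}")      # ~2.59e9

field_side_deg = 48.5
print(f"Raw field square: {field_side_deg**2:.0f} deg^2")   # ~2352 deg^2;
# the quoted usable survey field is about 2250 deg^2.
```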

The PLATO Data Centre (PDC) will be responsible for the development, implementation and operation of the data processing ground segment for PLATO. In particular, the PDC will generate the key data products (DPs), including DP1, the science-ready light curves and centroid curves for each star, corrected for instrumental effects; DP2, transit candidates and their parameters; DP3, asteroseismic mode parameters; DP4, stellar rotation periods and stellar activity; DP5, seismically determined stellar masses and ages; and DP6, the list of confirmed planetary systems, fully characterized by combining DP2–5 and follow-up observations. The PDC is organized into eight task areas: system architecture, (onboard) data processing algorithms, data processing development, input catalogue, ancillary data management, data analysis support tools, the stellar analysis system and the exoplanet analysis system.
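The chain from calibrated curves to confirmed planets can be pictured as a simple dependency graph. The sketch below only illustrates the ordering implied by the descriptions above (in particular, the assumption that DP5 is derived from the DP3 mode parameters); it is not the actual PDC design:

```python
# Illustrative dependency map of the PLATO data products described above.
# Structure and the DP5->DP3 link are assumptions for illustration only.
DP_DEPENDENCIES = {
    "DP1": [],                             # calibrated light/centroid curves
    "DP2": ["DP1"],                        # transit candidates and parameters
    "DP3": ["DP1"],                        # asteroseismic mode parameters
    "DP4": ["DP1"],                        # rotation periods and activity
    "DP5": ["DP3"],                        # seismic stellar masses and ages
    "DP6": ["DP2", "DP3", "DP4", "DP5"],   # confirmed planetary systems
}

def processing_order(deps):
    """Return one valid processing order (depth-first topological sort)."""
    order, done = [], set()

    def visit(node):
        if node in done:
            return
        for upstream in deps[node]:
            visit(upstream)
        done.add(node)
        order.append(node)

    for node in deps:
        visit(node)
    return order

print(processing_order(DP_DEPENDENCIES))
# -> ['DP1', 'DP2', 'DP3', 'DP4', 'DP5', 'DP6']
```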

Nick discussed the design of the Exoplanet Analysis System (EAS), the processing chain that will create the PLATO DPs and the final PLATO output, DP6 – the fully characterized list of confirmed planetary systems. The design leverages techniques developed during the CoRoT and Kepler missions, but the processing scale is beyond that of these missions, so a key part of the design will draw on the expertise developed during the recently launched Gaia mission.

Rob Simmonds talked about design ideas being worked on by the Square Kilometre Array (SKA) Science Data Processor (SDP) DELIV task area, whose role is to distribute products to end users and to manage the distribution of data. The SKA is an international project with enormous data management and data growth challenges: the data volume grows by 150% per year, and daily data rates reach 1 PB. Rob discussed how astronomers will discover and access data products from distributed data centers that are remote from the primary archive locations. The project is investigating the use of VO standards for data discovery and data access by archiving the metadata at CADC. Tools for remote visualization, data movement and data management are under study. This work is also looking at options for using regional centers to perform analysis of SKA data. Much of the design will, de facto, rely on expected continuous improvements in the performance of WANs. An international testbed is being created to test technologies and performance.
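For context, data discovery through VO standards typically looks something like the pyvo sketch below, which issues an ADQL query against an ObsCore-style TAP service. The endpoint URL, the query, and the assumption of CADC-hosted metadata are illustrative only, not the SDP design:

```python
# Sketch of VO-standard data discovery via an ObsCore TAP query (pyvo).
# The service URL is a placeholder; table and column names follow the
# IVOA ObsCore convention, not any specific SKA/CADC deployment.
import pyvo

service = pyvo.dal.TAPService("https://example.org/tap")  # hypothetical endpoint

query = """
SELECT TOP 10 obs_id, dataproduct_type, access_url
FROM ivoa.obscore
WHERE dataproduct_type = 'visibility'
  AND 1 = CONTAINS(POINT('ICRS', s_ra, s_dec),
                   CIRCLE('ICRS', 201.37, -43.02, 0.5))
"""

results = service.search(query)
for row in results:
    # access_url is what a remote data centre would resolve to fetch the product
    print(row["obs_id"], row["access_url"])
```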



Source: https://astrocompute.wordpress.com/2014/10/06/the-front-lines-of-adass-2014-october-6-2014-morning-sessions/
