
Large Synoptic Survey Telescope (LSST)
Data Management Test Plan

William O’Mullane, Mario Jurić, Frossie Economou

LDM-503

Latest Revision: 2017-07-04

This LSST document has been approved as a Content-Controlled Document by the LSST DM Technical Control Team. If this document is changed or superseded, the new document will retain the Handle designation shown above. The control is on the most recent digital document with this Handle in the LSST digital archive and not printed versions. Additional information may be found in the corresponding DM RFC.

Abstract

This is the Test Plan for Data Management. In it we define terms associated with testing and further test specifications for specific items.
Change Record

Version | Date       | Description                                     | Owner name
        | 2017-01-13 | First draft                                     | William O’Mullane
1.0     | 2017-06-30 | First approved release.                         | William O’Mullane
1.1     | 2017-07-04 | Minor cleanups for review. Approved in RFC-358. | W. O’Mullane

Document source location: https://github.com/lsst/LDM-503
Contents

1 Introduction
  1.1 Objectives
  1.2 Scope
  1.3 Assumptions
  1.4 Applicable Documents
  1.5 References
  1.6 Definitions, Acronyms, and Abbreviations
2 Roles and Reporting
3 DM Verification Approach
  3.1 Reports
  3.2 Components Under Test
  3.3 Testing Specification Document Format
4 Pass/Fail Criteria
5 Constraints and Limitations
  5.1 Procedural and Technical Limitations
  5.2 Requirements Traceability Constraints
    5.2.1 Scientific
    5.2.2 Computational
    5.2.3 KPMs
  5.3 Interfaces
6 Master Schedule
7 Verification Tests
  7.1 Science Platform with WISE data in PDAC (LDM-503-1)
  7.2 HSC reprocessing (LDM-503-2)
    7.2.1 Personnel
    7.2.2 Open issues
    7.2.3 Datasets
    7.2.4 Calibration Products Production
  7.3 Alert generation validation (LDM-503-3)
  7.4 Aux Tel DAQ integration functionality test (LDM-503-4)
  7.5 Test Report: Aux Tel DAQ interface Integration Verification and Spectrograph Operations Rehearsal (LDM-503-4b)
  7.6 Alert distribution validation (LDM-503-5)
  7.7 DM ComCam interface verification readiness (LDM-503-6)
  7.8 Camera data processing (LDM-503-7)
  7.9 Spectrograph data acquisition (LDM-503-8)
  7.10 Verification tests in advance of pre-ops rehearsal for commissioning #1 (LDM-503-9)
  7.11 DAQ validation (LDM-503-10)
  7.12 DM ComCam operations readiness (LDM-503-11a)
  7.13 Verification tests in advance of pre-ops rehearsal for commissioning #2 (LDM-503-11)
  7.14 Verification tests in advance of pre-ops rehearsal for commissioning #3 (LDM-503-12)
  7.15 Ops rehearsal DRP (ComCam data) (LDM-503-13)
  7.16 DM Software for Science Verification (LDM-503-14)
  7.17 Ops rehearsal DRP (SV data) (LDM-503-15)
  7.18 Verification tests in advance of full scale ops rehearsal #1 (LDM-503-16)
  7.19 Verification tests in advance of full scale ops rehearsal #2 (LDM-503-17)
8 Software Tools
  8.1 Continuous Integration and Unit Testing
  8.2 Code Reviews
  8.3 Automated Requirements Verification and KPM Measurement
9 Operations Validation
10 Science Validation
  10.1 Definition
  10.2 Schedule and Execution
    10.2.1 Schedule
    10.2.2 Execution
  10.3 Deliverables
  10.4 Organization and Resources
    10.4.1 Example
A Verification Matrix

Data Management Test Plan

1 Introduction

In this document we lay out the verification and validation approach for LSST Data Management. In addition, we outline some of the high level test milestones in Section 6 and our planned schedule for demonstrating interim verification status.

1.1 Objectives

We describe the test and verification approach for DM and describe various constraints and limitations in the testing to be performed. We also describe the validation tests to be performed on the partially and fully integrated system. We do not describe all tests in detail; those are described in dedicated test specifications for major components of Data Management. Here we outline the required elements for those specifications as well as the tools we use for continuous verification.
1.2 Scope

This provides the approach and plan for all of Data Management. It covers interfaces between Data Management and components from other LSST subsystems but nothing outside of Data Management. This document is change-controlled by the DMCCB and will be updated in response to any requirements updates or changes of approach.
1.3 Assumptions

We will run large scale Science Validations in order to demonstrate the system’s end-to-end capability against its design specifications. A large amount of informal science validation will be done in the teams and documented in technical notes; in this test plan we are looking for validation of the broader system and specifically operability, i.e. whether we can run this system every day for the 10 year planned survey with a reasonable level of operational support.
1.4 Applicable Documents

When applicable documents change, a change may be required in this document.

LPM-55  LSST Quality Assurance Plan
LDM-294 DM Project Management Plan
LDM-148 DM Architecture
1.5 References

[1] [LSE-29], Claver, C.F., The LSST Systems Engineering Integrated Project Team, 2016, LSST System Requirements, LSE-29, URL https://ls.st/LSE-29
[2] [LSE-30], Claver, C.F., The LSST Systems Engineering Integrated Project Team, 2016, Observatory System Specifications, LSE-30, URL https://ls.st/LSE-30
[3] [LSE-81], Dubois-Felsmann, G., 2013, LSST Science and Project Sizing Inputs, LSE-81, URL https://ls.st/LSE-81
[4] [LSE-61], Dubois-Felsmann, G., 2016, LSST Data Management Subsystem Requirements, LSE-61, URL https://ls.st/LSE-61
[5] [LSE-82], Dubois-Felsmann, G., Lim, K.T., 2013, Science and Project Sizing Inputs Explanation, LSE-82, URL https://ls.st/LSE-82
[6] [LPM-17], Ivezić, Ž., The LSST Science Collaboration, 2011, LSST Science Requirements Document, LPM-17, URL https://ls.st/LPM-17
[7] [LSE-163], Jurić, M., et al., 2017, LSST Data Products Definition Document, LSE-163, URL https://ls.st/LSE-163
[8] [LDM-240], Kantor, J., Jurić, M., Lim, K.T., 2016, Data Management Releases, LDM-240, URL https://ls.st/LDM-240
[9] [LDM-148], Lim, K.T., Bosch, J., Dubois-Felsmann, G., et al., 2017, Data Management System Design, LDM-148, URL https://ls.st/LDM-148
[10] [LDM-294], O’Mullane, W., Swinbank, J., Jurić, M., DMLT, 2017, Data Management Organization and Management, LDM-294, URL https://ls.st/LDM-294
[11] [LPM-55], Sweeney, D., McKercher, R., 2013, Project Quality Assurance Plan, LPM-55, URL https://ls.st/LPM-55
[12] [LSE-63], Tyson, T., DQA Team, Science Collaboration, 2017, Data Quality Assurance Plan: Requirements for the LSST Data Quality Assessment Framework, LSE-63, URL https://ls.st/LSE-63
1.6 Definitions, Acronyms, and Abbreviations

Acronym  Description
DAX      Data Access Services
DBB      Data BackBone
DM       Data Management
DMCCB    DM Change Control Board
DRP      Data Release Production
EFD      Engineering Facilities Database
HSC      Hyper Suprime-Cam
ICD      Interface Control Document
JIRA     Issue tracking product (not an acronym, but a truncation of Gojira, the Japanese name for Godzilla)
KPM      Key Performance Metric
LSST     Large Synoptic Survey Telescope
NCSA     National Center for Supercomputing Applications
OCS      Observatory Control System
OPS      OPerationS
QA       Quality Assurance
QC       Quality Control
Qserv    Proprietary LSST Database system
SPR      Software Problem Report
SQuaRE   Science Quality and Reliability Engineering
SV       Science Validation
TBD      To Be Defined (Determined)
UX       User interface widget
VCD      Verification Control Document

2 Roles and Reporting

Each test specification must make clear who the tester is.

Testers report issues (SPRs) through the Data Management ticketing system (i.e. JIRA at the time of this document revision) and also write a test report (and/or provide any necessary configuration for automatic report generation).

The test reports will be used to populate the verification control document (see Section 3). We are monitoring the LSST Systems Engineer’s approach to plan commissioning tests for LSST system-wide verification and will evaluate the merits of using the same toolchain for Data Management verification.

Operations rehearsals require an ops rehearsal coordinator to oversee the process. This is a distinct role from that of the tester. For example, the rehearsal may not be directed by the Operations Manager, since that person has a major role in the rehearsal. An individual not involved in the rehearsal itself will be identified to perform this function.

Tests and procedures will sometimes fail: a test specification may be re-run several times until it passes, but the report must include an explanation that indicates that any failures were understood (e.g. they were due to a fault that was fixed) or that the test was repeated sufficient times to ensure that passing it was not a transient success.

For large scale tests and rehearsals the DMCCB, or an individual designated by it, will be tasked to write up the findings as well as decide on timescales for re-running part or all of a test in case of failure or partial success.

Other parties that have a relevant role in Data Management verification are identified in the appropriate sections of the document; these are involved in their primary capacity (e.g. the DM Systems Engineer) and so are not individually listed in this section.
3 DM Verification Approach

Our approach towards verifying the Data Management requirements follows standard engineering practice. Each high level component will have at least one test specification defining a set of tests related to the design requirements for the component. These specifications are represented at the top of Figure 1. Any given requirement may have several tests associated with it in the specification; these tests may be phased to account for incremental delivery, depending on the need for certain functionality at a specific time.

The test spec will cover all aspects of the test as outlined in Section 3.3. These high level test specifications may call out individual lower level test specifications where it makes sense (either technically or programmatically) to test lower-level components in isolation.
3.1 Reports

As we execute tests we will generate test reports on the pass/fail status of the individual tests related to specific requirements. This information will allow us to build a Verification Control Document (VCD) (shown at the right of Figure 1). The VCD will provide the fractional verification status of each DM requirement. These will also be rolled up to the (higher) level of OSS (Observatory System Specifications; LSE-30) requirements. Figure 1 currently calls for a report from each test spec. This report may be captured directly in e.g. JIRA: it does not necessarily correspond to a separate (e.g. Word or LaTeX) document.

In cases of reports that are generated via automatic (continuous) verification, the report may be in the format of a Jupyter Notebook that can simultaneously serve as test specification, test report and, in some cases, the test script itself. This is the preferred method, provided the notebook-as-report is satisfactorily captured in DocuShare.
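As a minimal sketch of the notebook-as-report idea, the cell below both runs a check and records the outcome, so the executed notebook can be archived as the report. For concreteness it quotes the 60-second alert latency requirement (OTT1/DMS-REQ-0004, Section 5.2.2); the measurement function is an invented stand-in for whatever the real test would drive.

```python
# Hypothetical notebook cell: the surrounding narrative text would state the
# requirement under test; the cell executes the check and prints a structured
# result that travels with the executed (archived) notebook.
import datetime
import json

def measure_alert_latency_seconds():
    """Stand-in measurement; a real cell would exercise the system under test."""
    return 42.0

THRESHOLD_SECONDS = 60.0  # design specification for OTT1 (alert within 60 s)

measured = measure_alert_latency_seconds()
result = {
    "requirement": "DMS-REQ-0004",
    "measured_seconds": measured,
    "threshold_seconds": THRESHOLD_SECONDS,
    "status": "pass" if measured <= THRESHOLD_SECONDS else "fail",
    "run_at": datetime.datetime.utcnow().isoformat() + "Z",
}
print(json.dumps(result, indent=2))
```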
3.2 Components Under Test

The components of the DM system are outlined in LDM-294 and detailed in LDM-148. At a high level these components are represented in Figure 2. Based on those components we can see the set of Test Specifications needed in Table 1. At time of writing, document numbers are not available for all second-level components.
[Figure 1 (documentation tree diagram) appears here.]

FIGURE 1: Documentation tree for DM software relating the high level documents to each other (from LDM-294).

[Figure 2 (deployment diagram) appears here.]

FIGURE 2: DM components as deployed during Operations. Where components are deployed in multiple locations, the connections between them are labeled with the relevant communication protocols. Science payloads are shown in blue. For details, refer to LDM-148.

TABLE 1: Components from LDM-148 with the test specifications to verify them.

Component                           Testing Spec
NCSA Enclave                        LDM-532
- L1 System                         LDM-533
-- L1 Prompt Processing             TBD
-- L1 Alert Distribution            TBD
-- L1 Alert Filtering (mini-broker) TBD
-- L1 Quality Control               TBD
-- L1 OCS Batch Processing          TBD
-- L1 Offline Processing            TBD
- L2 System                         LDM-534
-- L2 QC                            LDM-534
-- L2 Data Release                  LDM-534
-- L2 Calibration Products          LDM-534
Data Backbone                       LDM-535
- DBB Data Services                 LDM-536
-- DBB Qserv                        LDM-552
-- DBB Databases                    TBD
-- DBB Image Database/Metadata Prov TBD
-- DBB Data Butler Client           TBD
- DBB Infrastructure                LDM-537
-- DBB Tape Archive                 TBD
-- DBB Cache                        TBD
-- DBB Data Endpoint                TBD
-- DBB Data Transport               TBD
-- Networks                         TBD
Base Enclave                        LDM-538
-- Prompt Processing Ingest         TBD
-- Telemetry Gateway                TBD
-- Image and EFD Archiving          TBD
-- OCS Driven Batch Control         TBD
Data Access Center Enclave          LDM-539
-- Bulk Data Distribution           TBD
-- Science Platform                 LDM-540
-- Science Platform JupyterLab      TBD
-- Science Platform Portal          TBD
-- DAX VO+ Services                 TBD
Commissioning Cluster Enclave       LDM-541
-- SuperTask

The test items covered in this test plan are:

• Data Management and its primary components for testing and integration purposes. These are listed in Table 1. All components listed in orange and yellow have specifications in the corresponding documents listed. Major sub-components in white may have individual test specifications or be addressed in the component they are under, depending on applicable factors such as whether they are scheduled for testing at the same time and/or whether they share architectural components or are largely distinct.

• The external interfaces between Data Management and other sub-systems. These are described in DocuShare collection 5201.

• Operational procedures such as the Data Release Process, the Software Release Process and the Security Plan.
3.3 Testing Specification Document Format

The testing specification documents will be drawn up in conjunction with the LSST Systems Engineer. In all cases they will include the following (one possible machine-readable rendering of these elements is sketched after the list):

• A list of components being tested within the scope of the test specification document.
• A list of features in those components that are being explicitly tested.
• The relationship between features under test and the identified requirements for the component.
• A description of the environment in which the tests are carried out (e.g. hardware platform) and a description of how they differ from the operational system in tests prior to final integration (e.g. interfaces that may be mocked without affecting that component’s testing).
• The inputs (such as data, API load, etc.) that are to be used in the test.
• Pass/fail criteria on any metrics or other measurements.
• How any outputs that are used to determine pass/fail (e.g. data or metrics) are to be published or otherwise made available.
• A software quality assurance manifest, listing (as relevant) code repositories, configuration information, release/distribution methods and applicable documentation (such as installation instructions, developer guide, user guide, etc.).
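Purely as an illustration, the checklist above could also be captured as structured data that report-generation tooling consumes. The sketch below is hypothetical; none of the field names are an agreed LSST schema.

```python
# Hypothetical test-specification manifest covering the elements listed above;
# every field name and value here is illustrative, not a defined LSST format.
test_specification = {
    "components": ["DBB Qserv"],                   # components under test
    "features": ["large-scale catalog queries"],   # features explicitly tested
    "requirements": ["DMS-REQ-xxxx"],              # traced requirement IDs (placeholder)
    "environment": {
        "platform": "LSST Verification Cluster",
        "mocked_interfaces": ["OCS telemetry"],    # divergence from the operational system
    },
    "inputs": ["HSC PDR1 subset"],                 # data, API load, etc.
    "pass_fail_criteria": ["query latency within specification"],
    "outputs": ["test report captured in JIRA"],
    "sqa_manifest": {
        "repositories": ["https://github.com/lsst/qserv"],
        "release_method": "standard distribution channel",
        "documentation": ["installation instructions", "user guide"],
    },
}
```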
4 Pass/Fail Criteria

A test case will be considered “Passed” when:

• All of the test steps of the Test Case are completed, and
• All open SPRs from this Test Case are considered noncritical by the DMCCB.

A test case will be considered “Partially Passed” when:

• Only a subset of all of the test steps in the Test Case are completed and/or there remain open SPRs which are regarded as critical by the DMCCB, but
• The DMCCB regards the overall purpose of the test as having been met.

A test case will be considered “Failed” when:

• Only a subset of all of the test steps in the Test Case are completed and/or there remain open SPRs which are regarded as critical by the DMCCB, and
• The DMCCB regards the overall purpose of the test as not having been met.

Note that in LPM-17 science requirements are described as having a minimum specification, a design specification and a stretch goal. We preserve these distinctions where they have been made in, for example, the verification framework and automated metric harness. However, for the purposes of pass/fail criteria, it is the design specification that is verified as having been met for a test to pass without intervention of the DMCCB.

Ultimately, if it proves impossible to satisfy a requirement at design specification, LSST Project level approval is required to accept the minimum specification.
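Since the three outcomes turn on three judgments (step completion, critical open SPRs, and the DMCCB’s view of the test’s purpose), the decision logic reduces to a few lines; a minimal sketch, assuming those judgments are supplied as booleans:

```python
def test_case_outcome(all_steps_completed: bool,
                      critical_sprs_open: bool,
                      purpose_met: bool) -> str:
    """Classify a test case per the pass/fail criteria above."""
    if all_steps_completed and not critical_sprs_open:
        return "Passed"
    # Incomplete steps and/or critical open SPRs remain: the outcome rests on
    # whether the DMCCB regards the overall purpose of the test as met.
    return "Partially Passed" if purpose_met else "Failed"
```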

5 Constraints and Limitations
5.1 Procedural and Technical Limitations

• Verification is being done on the basis of precursor data sets such as HSC (see for example Section 7.2), and eventually with engineering data from the LSST camera test stands and commissioning camera. These are just a proxy for full-focal-plane on-site LSST data.

• Metric measurements and operational rehearsals during construction may not involve critical operational systems that are still in development. For example, while computational performance is being measured, computationally dominant algorithmic steps such as deblending and multi-epoch fitting may only be modeled, since they have not yet been implemented; operational rehearsals are done without the factory LSST workflow system; etc.
5.2 Requirements Traceability Constraints

This section outlines the traceability of requirements through key LSST and Data Management documentation. In principle all DM requirements should be flowed down to LSE-61 (the DM System Requirements, or DMSR). In practice, while we are working to make that the reality, the current situation is outlined here.

5.2.1 Scientific

Some scientific requirements are captured in LSE-29 (the LSST System Requirements, or LSR) and flow down to LSE-30 (the Observatory System Specifications, or OSS). Work remains to flow them down from there to LSE-61.

Some requirements are also specified in LSE-163 (the Data Products Definition Document, or DPDD) and will flow down from there to LSE-61.
5.2.2 Computational

There are requirements in LSE-61 (aka DMSR), which captures the LSE-30 (OSS) requirements that DM is responsible for. These are:

• The primary computational performance requirement flowed down from LSE-29 (LSR) is OTT1, which is the requirement to issue an alert within 60 seconds of exposure end. [DMS-REQ-0004, LSR-REQ-0101]

• Another requirement flowed down from LSE-29 is calculation of orbits within 24 hours of the end of the observing night. [DMS-REQ-0004, LSR-REQ-0104, L1PublicT]

• There is a new (not yet baselined?) requirement for the calibration pipeline to reduce calibration observations within 1200 seconds. [calProcTime]

• A nightly report on data quality, data management system performance and a calibration report have to be generated within 4 hours of the end of the night. [DMS-REQ-0096, dqReportComplTime]

Work remains to flow down LSE-63, the Data Quality Assurance Plan, to LSE-61.

Note that there are no computational requirements on individual technical components such as data processing cluster availability, database data retrieval speeds, etc. There is, however, an upper limit on acceptable data loss, and there is a network availability requirement.
5.2.3 KPMs

As a proxy for validating the DM system, LDM-240, the (now obsolete) DM release plan, defined a set of Key Performance Metrics that the system could be verified against. KPMs were not formally flowed down from LSE-29 through LSE-30, although there is some overlap with LSE-29 requirements. In particular, the non-science KPMs only exist in LDM-240, although they are implicitly assumed in the sizing model presented in LSE-81 and LSE-82. Although other material in LDM-240 is now regarded as obsolete, these KPMs are still being tracked.
5.3 Interfaces

We will verify external interfaces to other subsystems and selected major internal interfaces. The ICDs describing external interfaces are curated in DocuShare Collection 5201.
6 Master Schedule

The schedule for testing the system until operations commence (currently 2022) is outlined in Table 3. These tests mark the major milestones (Level 2, in the parlance of LDM-294) of the DM project. They are closely tied to major integration events for the overall LSST system, as shown in Figure 3.
Table 3: List of High Level integration tests for DM

ID | Date/Freq | Location | Title, Description
LDM-503-NLY | Nightly | Amazon | Nightly Tests: run all automated tests on all DM packages automatically.
LDM-503-WLY | Weekly | Amazon | Integration tests: basic sanity check to make sure the code compiles, that no regressions have occurred, and also pushing a basic data set through.
LDM-503- | TBD | NCSA | Interface tests: the interface tests have to be planned and documented in a separate test plan that should include tests for each two parties on an interface (2-by-2 tests) as well as tests for all parties. Some of these will be covered again in E2E tests, but before that we should be confident they work. This includes internal and external interfaces.
LDM-503- | TBD | NCSA + IN2P3 | End to End Tests ?? Freeze software for Ops .. https://confluence.lsstcorp.org/display/DM/Data+Processing+End+to+End+Testing What is the status of these?
LDM-503-1 | 2017-11-30 | NCSA | Science Platform with WISE data in PDAC: SUIT continues PDAC development, adding the WISE data, further exercising the DAX dbserv and imgserv APIs, and taking advantage of metaserv once it becomes available.
LDM-503-2 | 2017-11-30 | NCSA | HSC reprocessing: validate that the data products generated with the LSST stack match or improve upon HSC products. Validate the ops platform at NCSA, including installing the stack, starting and stopping production. Generate a validation data set for weekly integration and other tests.
LDM-503-3 | 2017-11-30 | NCSA | Alert generation validation: validate the alert generation stack performance on several DECam and HSC datasets.
LDM-503-4 | 2018-02-01 | NCSA | Aux Tel DAQ integration functionality test: the production Aux Tel data acquisition hardware should be available in Tucson in 2018-02. We should prepare by testing the adjacent archive systems.
LDM-503-4b | 2018-02-12 | NCSA | Test Report: Aux Tel DAQ interface Integration Verification and Spectrograph Operations Rehearsal: the production Aux Tel data acquisition hardware should be available in Tucson in 2018-02. We should test integration with the adjacent archive systems.
LDM-503-5 | 2018-05-31 | NCSA | Alert distribution validation: validate the alert distribution system and mini-broker fed by live or simulated live data.
LDM-503-6 | 2018-06-30 | NCSA | DM ComCam interface verification readiness: ComCam will be in Tucson on 2018-07-24. The DM system must be ready to deal with it.
LDM-503-7 | 2018-08-31 | NCSA | Camera data processing: partial camera data should be available to DM July 31st. We plan to test the DM stack with it.
LDM-503-8 | 2018-11-30 | NCSA | Spectrograph data acquisition: demonstrate that we can acquire (and process?) data from the spectrograph.
LDM-503-9 | 2018-11-30 | NCSA | Verification tests in advance of pre-ops rehearsal for commissioning #1: test how the system will run during commissioning. Chuck requests that this initial test focus on ISR.
LDM-503-10 | 2019-02-28 | NCSA | DAQ validation: there is a project Milestone that DAQ/DM/Networks are available March 15th. We need to run tests in Feb to show this is ready.
LDM-503-11a | 2019-10-20 | NCSA | DM ComCam operations readiness: ComCam will be in use in Nov. The DM system must be ready to deal with it.
LDM-503-11 | 2019-10-31 | NCSA | Verification tests in advance of pre-ops rehearsal for commissioning #2: more complete commissioning rehearsal. How do the scientists look at data? How do they provide feedback to the telescope? How do we create/update calibrations? Exercise control loops.
LDM-503-12 | 2020-01-31 | NCSA | Verification tests in advance of pre-ops rehearsal for commissioning #3: dress rehearsal. Commissioning starts in April, so by this stage we should be ready to do everything needed.
LDM-503-13 | 2020-11-30 | NCSA | Ops rehearsal DRP (ComCam data): ComCam data will now be available. Demonstrate its use in producing a data release.
LDM-503-14 | 2021-03-31 | NCSA | DM Software for Science Verification: Science Verification starts in April. Demonstrate that all required DM software is available.
LDM-503-15 | 2021-11-30 | NCSA | Ops rehearsal DRP (SV data): Science Verification data will now be available. Demonstrate its use in producing a data release.
LDM-503-16 | 2022-02-28 | NCSA | Verification tests in advance of full scale ops rehearsal #1: test readiness for operations.
LDM-503-17 | 2022-09-30 | NCSA | Verification tests in advance of full scale ops rehearsal #2: confirm readiness for operations.

7 Verification Tests

7.1 Science Platform with WISE data in PDAC (LDM-503-1)

SUIT continues PDAC development, adding the WISE data, further exercising the DAX dbserv and imgserv APIs, and taking advantage of metaserv once it becomes available.

From DAX: need to be clear about which WISE datasets are to be loaded; the data wrangling effort required to download, inspect, convert, partition, and load each additional dataset is cumulatively non-trivial for DAX.

[Figure 3 appears here.]

FIGURE 3: DM major milestones (LDM-503-x) in the LSST schedule.
7.2 HSC reprocessing (LDM-503-2)

7.2.1 Personnel

Jim Bosch, Robert Lupton, John Swinbank, Hsin-Fang Chiang.

7.2.2 Open issues

• Check that data products generated with the LSST stack match or improve upon the equivalent HSC products.
• Validate the ops platform at NCSA, including installing the stack, starting and stopping production.
• Generate a validation data set for weekly integration and other tests.

From the pipelines perspective, there’s no new work involved here beyond the v13.0 release (at which point the HSC merge is complete and QA has been performed). Suggest we’d run this with the latest release as of the date of the test (so this is 14.N, where 14.0 is the end-of-S17 release). Again from pipelines, detailed definition of the “ops platform” is not necessary. Suggest that the plausible availability of services should drive the test plan in this case, rather than vice versa.
7.2.3 Datasets

During F17, we expect to continue testing and validation of Data Release Production algorithms primarily by repeated reprocessing of the first HSC Public Data Release (PDR1) on the LSST Verification Cluster (VC).

We expect to perform processing at three different scales:

• The full PDR1 dataset;
• A substantial fraction (nominally 10%) of PDR1;
• The HSC “RC” dataset, a subset of PDR1 pre-selected for pipeline release testing. (The existing RC dataset is not in fact entirely public; however, it should be straightforward to define a new RC-sized dataset which is.)

The full PDR1 dataset consists of 6202 exposures, or 17 TB of raw data. It is now available in the /datasets/ filesystem on the VC (see RFC-297, DM-9683). One complete reprocessing of PDR1 requires around 200 TB of storage (see DM-8143); we therefore assume that 10% of PDR1 requires around 20 TB, and we expect reprocessing the RC dataset to consume around 7 TB.

Again following DM-8143, we expect one complete reduction of PDR1 to consume around 750 core-weeks of CPU time (and, similarly, 75 core-weeks for a 10% fraction, or 25 core-weeks for the RC dataset). Note that:

• As of April 2017 there are 1152 cores in the VC, so we might reasonably expect that the entire data release can be processed in about 5 days.
• This assumes minimal inefficiency due to workflow; we expect wall-clock time to be rather higher.
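The 5-day estimate, and the steady-state storage figures quoted in the subsections below, follow directly from these numbers; a quick back-of-the-envelope check (plain arithmetic, no LSST code):

```python
# Back-of-the-envelope check of the processing and storage figures used here
# and in the subsections below.
CORE_WEEKS_PER_PDR1_RUN = 750   # one complete PDR1 reduction (DM-8143)
VC_CORES = 1152                 # Verification Cluster size, April 2017

weeks = CORE_WEEKS_PER_PDR1_RUN / VC_CORES
print(f"{weeks * 7:.1f} days per full reduction at perfect efficiency")  # ~4.6, i.e. "about 5 days"

print(3 * 200, "TB steady state retaining three PDR1 runs")  # ~600 TB (Sec. 7.2.3.1.1)
print(4 * 7, "TB steady state retaining four RC runs")       # ~28 TB (Sec. 7.2.3.1.2)
```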
7.2.3.1 Automated Processing  We expect that some processing takes place automatically, without intervention or explicit request from the DRP team. In each case, processing makes use of the latest weekly release of the LSST stack, with the default configuration; in special circumstances, the DRP team may request an alternative version and/or configuration before the processing run starts.

The pipeline logic will be provided by the DRP team in whatever the currently-accepted standard for LSST data processing is. That is, we expect to use pipe_drivers/ctrl_pool style distribution middleware until the point at which a new solution, e.g. one based on SuperTask and Pegasus, becomes available. At that point, the DRP team is responsible for porting their pipelines to the new system.

We expect that regular execution of the relevant pipelines and checking for successful execution will take place outside the scope of DRP. We expect that failures at the execution middleware, hardware or networking layer will be resolved without the need for explicit pipelines intervention. We expect the DRP team to be responsible for triaging and resolving failures in pipeline logic, configuration, etc.

In the below, we suggest a calendar-based processing scheme. In practice, one which is tied to specific stack releases, rather than to the date, is likely preferable. However, implementing such a scheme would require rethinking the stack release procedure.
7.2.3.1.1 PDR1  To be reprocessed every two months. The results of the last three jobs should be retained: in the steady state this will consume ~600 TB of storage.

7.2.3.1.2 RC Dataset  To be reprocessed weekly. The results of the last four jobs should be retained: in the steady state this will consume ~28 TB of storage.
7.2.3.2 Manual Processing  We request a mechanism by which developers may manually trigger processing jobs which will address broadly arbitrary subsets of HSC PDR1 with user-specified software versions and configurations, e.g. as supplied through a configuration file (or shell script, etc.).
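As a sketch of the information such a request would carry, the following is one possible shape for a developer-supplied job description; none of these keys are a defined NCSA or DRP interface.

```python
# Hypothetical ad hoc job request; key names and values are illustrative only.
adhoc_job = {
    "stack_version": "w_2017_30",                 # a weekly release tag, for example
    "payload": "DRP",
    "dataset": "HSC PDR1",
    "data_selection": "tract=9813 filter=HSC-I",  # an arbitrary subset
    "config_overrides": ["<task>.<parameter>=<value>"],  # placeholder syntax
    "output_root": "/project/<user>/adhoc/",
}
```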
Although DRP developers will be ultimately responsible for the successful execution of these jobs, we request support from NCSA in triaging failures which may be due to cluster or middleware issues.
7.2.3.2.1 Storage  We expect that the total storage requirement for such ad hoc jobs during F17 will amount to no more than 200 TB. We suggest that this be provisioned in /project/, and that it follow the regular policies which apply to that filesystem.
7.2.3.2.2 Compute  We expect to consume around 50 core-weeks per calendar week on ad hoc processing (that is, equivalent to two reductions of the RC dataset per week).
7.2.4 Calibration Products Production
7.2.4.1 Datasets  We expect that data from both TS8 (RFC-301) and the 0.9m at CTIO (RFC-313) continue to be regularly made available on the /datasets/ filesystem.

On the timescale of F17, we expect these datasets to total no more than 20 TB.
7.2.4.2 Automated Processing  We do not request any automated processing of data for Calibration Products Production during F17.
7.2.4.3 Manual Processing  We expect that developers will manually trigger processing jobs which will address broadly arbitrary subsets of the TS8 & CTIO data with user-specified software versions and configurations, e.g. as supplied through a configuration file (or shell script, etc.).

Although DRP developers will be ultimately responsible for the successful execution of these jobs, we request support from NCSA in triaging failures which may be due to cluster or middleware issues.
7.2.4.3.1 Storage  We expect that the total storage requirement for such ad hoc jobs during F17 will amount to no more than 50 TB. We suggest that this be provisioned in /project/, and that it follow the regular policies which apply to that filesystem.

7.2.4.3.2 Compute  We expect to consume no more than 25 core-weeks per calendar week processing this data.
7.3 Alert generation validation (LDM-503-3)

Validate the alert generation stack performance on several DECam and HSC datasets.

“Stack” is probably ill-defined here: is this simply testing science logic, or are we going after a wider integration exercise?
7.4 Aux Tel DAQ integration functionality test (LDM-503-4)

The production Aux Tel data acquisition hardware should be available in Tucson in 2018-02. We should prepare by testing the adjacent archive systems.

A minimal DM-only system that can archive mocked-up images and demonstrate that they can be retrieved, with provenance and metadata.

7.5 Test Report: Aux Tel DAQ interface Integration Verification and Spectrograph Operations Rehearsal (LDM-503-4b)

The production Aux Tel data acquisition hardware should be available in Tucson in 2018-02. We should test integration with the adjacent archive systems.

A minimal system that can archive simulated images from the Aux Tel DAQ and demonstrate that they can be retrieved.
7.6 Alert distribution validation (LDM-503-5)

Validate the alert distribution system and mini-broker fed by live or simulated live data.

Can we test a SUIT interface to the broker at this point? I believe it’s not scheduled until later in construction.
7.7 DM ComCam interface verification readiness (LDM-503-6)

ComCam will be in Tucson on 2018-07-24. The DM system must be ready to deal with it.

“The DM system” needs some further definition: what do we want to test here? Data flow from ComCam to the Data Backbone, or science processing of ComCam data? Note the LSE-79 requirements for DM services in support of ComCam in table 8. They’re required by Nov 2019/Feb 2020; it may be more appropriate to test some of them at a later date?
7.8 Camera data processing (LDM-503-7)

Partial camera data should be available to DM July 31st. We plan to test the DM stack with it.
7.9 Spectrograph data acquisition (LDM-503-8)

Demonstrate that we can acquire (and process?) data from the spectrograph.

Per LSE-79, Aux Tel delivery in Nov 2017 (i.e., a year before this milestone) includes: an EFD ETL service; an Aux Telescope Archiving Service; the Data Backbone in support of Aux Telescope archiving. Do we need to schedule another test to cover that?
7.10 Verification tests in advance of pre-ops rehearsal for commissioning #1 (LDM-503-9)

Test how the system will run during commissioning. Chuck requests that this initial test focus on ISR.

“Focus on ISR”: we should test whatever we have available. See LSE-79 for a list of requirements.
7.11 DAQ validation (LDM-503-10)

There is a project Milestone that DAQ/DM/Networks are available March 15th. We need to run tests in Feb to show this is ready.

7.12 DM ComCam operations readiness (LDM-503-11a)

ComCam will be in use in Nov. The DM system must be ready to deal with it.

“The DM system” needs some further definition: what do we want to test here? Data flow from ComCam to the Data Backbone, or science processing of ComCam data? Note the LSE-79 requirements for DM services in support of ComCam in table 8. They’re required by Nov 2019/Feb 2020; it may be more appropriate to test some of them at a later date?
7.13 Verification tests in advance of pre-ops rehearsal for commissioning #2 (LDM-503-11)

More complete commissioning rehearsal: how do the scientists look at data? How do they provide feedback to the telescope? How do we create/update calibrations? Exercise control loops.
7.14 Verification tests in advance of pre-ops rehearsal for commissioning #3 (LDM-503-12)

Dress rehearsal: commissioning starts in April, so by this stage we should be ready to do everything needed.
7.15 Ops rehearsal DRP (ComCam data) (LDM-503-13)

ComCam data will now be available. Demonstrate its use in producing a data release.

Note that LSE-79 requires a suite of DM services in support of the full camera in May 2020. It seems inappropriate to test them as part of a commissioning ops rehearsal, but they are due well before this date. Do we need another test milestone?
7.16 DM Software for Science Verification (LDM-503-14)

Science Verification starts in April. Demonstrate that all required DM software is available.

SV will include calculating all KPMs to demonstrate that we are reaching the science requirements. That obviously means we’ll need code which is both capable of reaching those requirements and of calculating the KPMs.
7.17 Ops rehearsal DRP (SV data) (LDM-503-15)

Science Verification data will now be available. Demonstrate its use in producing a data release.
7.18 Verification tests in advance of full scale ops rehearsal #1 (LDM-503-16)

Test readiness for operations.
7.19 Verification tests in advance of full scale ops rehearsal #2 (LDM-503-17)

Confirm readiness for operations.

8 Software Tools
A number of tools and development practices are in use in Data Management to ensure software quality and to verify requirements are met. These tools are used continuously (e.g. to measure key performance metrics routinely) or periodically (e.g. software release characterizations) and so will be well understood by the time the formal verification phase begins.
8.1 Continuous Integration and Unit Testing

Code is checked via a continuous integration (CI) service, both for on-demand developer use and for verifying the quality of the master branch. Irrespective of supported platforms, we have a practice of verifying that the stack can run on at least two distinct operating systems/platforms, as portability is often a good indicator of maintainability. The CI service also permits verification that the codebase runs with different third party dependencies; for example, we test that the Python code runs both under (legacy) Python 2.7 and Python 3. This reduces the foreseeable technical debt of porting to Python 3 for operations.

Unit testing policy is described in the DM Developer Guide under “Unit Test Policy”.
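As an illustration of the granularity this policy targets, here is a minimal test written for the pytest framework (to which DM is moving; see the tooling note at the end of this section). Both the module layout and the function under test are invented for the example.

```python
# Minimal unit test sketch; the function under test is invented for illustration.
import math

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Toy great-circle separation in degrees (illustrative only)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    # Clamp to guard against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def test_zero_separation():
    assert angular_separation_deg(10.0, -5.0, 10.0, -5.0) == 0.0

def test_known_separation():
    # Two points one degree apart in RA on the equator are one degree apart.
    assert abs(angular_separation_deg(0.0, 0.0, 1.0, 0.0) - 1.0) < 1e-9
```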
Roles and responsibilities in this area include:

• The DM Systems Engineering Team is responsible for approving dependencies and setting strategy, such as the approach to Python 3 portability.
• The DM Systems Engineering Team is responsible for setting the unit testing policy.
• The SQuaRE team is responsible for developing, operating and supporting continuous integration services.
• The SQuaRE team determines platform release practice in conjunction with the other teams, notably including Architecture.
At the time of this revision we do not have unit test coverage tooling for Python. This will be remedied with an upcoming switch to the pytest framework.
8.2 Code Reviews

DM’s process requires that every story resulting in code changes to the stack is reviewed prior to being merged to master. This serves both as code quality verification and also ensures that at least one other team member has some familiarity with a particular part of the codebase. DM’s Code Review process is described in the DM Developer Guide under the section “DM Code Review and Merging Process”.

Roles and responsibilities in this area include:

• The DM Systems Engineering Team defines the development process and style guide, including the code review standard.
• SQuaRE is responsible for supporting tooling to assist code review (e.g. linters, JIRA-GitHub integration, etc.).
8.3 Automated Requirements Verification and KPM Measurement

DM uses a harness for continuous metric verification. In the software development context this is used for:

• Calculating KPMs where available and alerting when they exceed specification.
• A regression testing framework for any developer-supplied metric, with optional alerts when excursions occur from past values, to verify that performance is not being degraded by new code or environments.
• Visualizing these results and linking them back to build and pull request information.
• Drill-down of those metrics in pre-defined visualization templates geared towards specific verification use-cases.
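A minimal sketch of the kind of check such a harness performs, assuming a lower-is-better metric summarized by a fresh value, a specification threshold, and a short history of past values; the names here are illustrative, not the harness’s actual API.

```python
# Illustrative metric check: compare a freshly measured value against its
# specification and against recent history; not the real harness interface.
from statistics import mean, stdev

def check_metric(name, value, spec_upper, history, n_sigma=3.0):
    """Return alert strings for a lower-is-better metric."""
    alerts = []
    if value > spec_upper:
        alerts.append(f"{name}: {value} exceeds specification {spec_upper}")
    if len(history) >= 2:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (value - mu) > n_sigma * sigma:
            alerts.append(f"{name}: {value} is a >{n_sigma} sigma excursion "
                          f"from recent mean {mu:.3g}")
    return alerts

# Example: a latency-style metric against a hypothetical 60 s specification.
print(check_metric("alert_latency_s", 71.0, 60.0, [40.0, 42.0, 39.0, 41.0]))
```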
Roles and responsibilities in this area include:

• The pipeline teams are responsible for providing the code and data to calculate the KPMs.
• SQuaRE is responsible for developing and operating the continuous metric verification services.
• Individual developers contribute non-KPM metrics as desired.

9 Operations Validation

Operations Validation of the system is done through Operations Rehearsals (and/or end-to-end tests). This may repeat some or all of a science validation exercise, but in a more operational setting with a focus on operations. The proposed rehearsal dates are listed in Table 4.
Table 4: Operations rehearsals for Ops validation of DM

Date/Freq | Location | Title, Description
Oct 2018 | NCSA | Operations rehearsal for commissioning: with TBD weeks of commissioning (let’s say a week), pick which parts of the plan we could rehearse. Chuck suggests Instrument Signal Removal should be the focus of this (or the next) rehearsal.
Oct 2019 | NCSA | Operations rehearsal #2 for commissioning: more complete rehearsal. Where do the scientists look at quality data? How do they feed it back to the Telescope? How do we create/update calibrations? Exercises some of the control loops.
Jan 2020 | Base | Operations rehearsal #3 for commissioning: dress rehearsal, just like it will be in April for the actual commissioning.
Dec 2020 | NCSA | Operations rehearsal data release processing (commissioning data): dress rehearsal, just like it will be in April for the actual commissioning.
2021 | NCSA | Operations rehearsal for data release processing (regular data).
Feb 2022 | NCSA/Base | Operations rehearsal: rehearsals for real operations, which start Oct 2022.
Sept 2022 | NCSA/Base | Operations rehearsal: full dress rehearsal for real operations, which start Oct 2022.

10 Science Validation

10.1 Definition

We define DM Science Validation as the process by which we assess whether the as-built Data Management system meets the needs of the scientific community and other identified stakeholders.

We assess the projected and realized scientific usability of the system by periodically exercising the integrated system in a way that goes beyond synthetic unit and integration tests and verification of piece-wise requirements as described in previous sections. In other words, we attempt to use the system in ways we expect it to be used by its ultimate users: scientists. An example may be performing a mock science study on the results of processing of precursor data, or performing a mock science-like activity (e.g., interactive analysis of time-domain datasets) on a partially stood-up service (e.g., the Notebook aspect of the LSST Science Platform). We record and analyze any issues encountered in such usage, and feed this information back to the DM Science and DM development teams.

Science Validation exercises are designed to close the design-build-verify loop, and enable one to measure the degree to which the requirements, designs, the as-built system, and future development plans continue to satisfy stakeholder needs. They also provide valuable feedback about modifications needed to ensure the delivery of a scientifically capable system. Ultimately, SV activities transfer into commissioning SV activities and provide training to the future members of the Commissioning team.
10.2 Schedule and Execution

10.2.1 Schedule

DM SV activities are planned and prepared in a rolling wave fashion in parallel with development activities (on a 6-month cycle, or perhaps a year). The SV activities will typically be designed so as to exercise the capabilities of the system expected to be delivered at the end of a given development cycle. These follow a long-term roadmap of SV activities, linked to product delivery milestones in the DM Construction Plan (see the table in Section 6). The Science Validation (SV) team guides the definition of goals of those activities, in close consultation with the DM Project Manager.

By their nature, SV activities will typically lag behind deliveries of the (sub)system being verified; ideally, they will commence immediately upon delivery. Preparatory SV activities (e.g., identification and acquisition of suitable datasets, identification of potential Science Collaboration resources to include on the activity, or development of activity-specific analysis codes) will commence as early as feasible. The DM SV Scientist will coordinate the execution of all SV activities.

SV activities should aim to take no longer than two months to conclude, to enable rapid actionable feedback to DM Management and DM Subsystem Science.
10.2.2 Execution

Science Validation activities typically follow the successful execution of unit and integration test activities described in the previous sections, especially the larger “dress rehearsals” and “data challenges” listed in Section 6 (Master Schedule).

Following successful service stand-up or data challenge execution (at integration and unit test level), the generated data products or integrated services are turned over to the SV team. The SV team performs additional tests and data analyses to exercise the integrated system and assess its quality relative to expectations for the current phase of construction. This assessment is fed back to the DM Subsystem Science and Systems Engineering teams to inform them about the status of, and needed improvements to, the system.

Beyond reporting on the results, the SV team examines the tests or procedures developed in this phase and identifies those that are good new metrics of system quality and could be run in an automated fashion. These are fed back to the development teams for productizing and incorporation into the automated QC systems.
10.3 Deliverables

Key deliverables of Science Validation activities are:

• Reports on the assessed capability of the Data Management System to satisfy stakeholder needs. The assessments shall take into account the expected maturity of the system being tested.

• Recommendations for improvements and changes, both in the quality of as-constructed systems (i.e., what needs to be built differently or better, to make it more consistent with the system vision) and in the overall system vision (i.e., recommendations on where the vision may need to be modified to fully respond to stakeholder needs).

• Measurements of performance metrics that do not lend themselves to easy automation (e.g., science activities requiring human involvement, like visual classification, or UX tests).

• Identification of new performance metrics to be tracked, including potential deliveries of code to the DM Construction and I&T teams for inclusion in automated quality control pipelines.

• Other deliverables as charged when chartering a particular SV exercise.
10.4 OrganizationandResources
DM Science Validation Team
Institutional Science Leads
DM Science Pipelines Scientist
(Robert Lupton)
DM SST Staff (variable)
DM Staff (detailed)
External Scientists (variable)
DM Science Validation Scientist
FIGURE 4: OrganogramoftheDataManagementScienceValidationTeam. Thegroupis
chaired by the DM Science Validation Scientist, with the DM Science Pipelines Scientist and
Institutional Science Leads making up the permanent membership. Depending on the SV
activities being executed at any given time, the group may draw on additional temporary
members from DM SST Staff, the broader DM Construction staff, as well as external scien-
tists(e.g.,ScienceCollaborationmemberscommittedtoassistingSVgoals).SVmembership
is reassessed on a cycle by cycle basis, with estimates incorporated in the long-term plan.
The DM Subsystem Scientist is accountable to the LSST Project Scientist for the successful execution of DM Science Validation activities. This responsibility is delegated to the DM Science Validation Scientist, who leads the Science Validation (SV) team.
The SV team guides the definition of goals for, and receives the products of, dress rehearsal activities, consistent with the long-term testing roadmap defined in Section 6. Decisions on the strategic goals of SV exercises are made in close consultation and coordination with the DM Project Manager and Subsystem Scientist, to whom the results of SV activities are also reported.
SV activities draw on the resources of the DM System Science Team, but may also tap into the broader construction team if needed (as jointly agreed with the DM Project Manager), as well as contributors from the LSST Science Collaborations. Additional members may be added as needed, depending on the SV activities being considered, based on the recommendation of the DM SV Scientist, and subject to resource constraints.
The SV Scientist, the DM Science Pipelines Scientist, and all Institutional Science Leads are ex-officio members of the SV Team. The DM Project Scientist and Managers are not formal members, but monitor the work of the group.
10.4.1 Example
An example of a Science Validation activity may be as follows:
• Based on the long-term development roadmap and the new capabilities expected to be delivered, at the beginning of a 6-month cycle the SV Team defines the goals of a data challenge to be executed at the end of the cycle. For the purposes of this example, we assume the major new feature to be delivered is astrometric calibration and the estimation of proper motions.
• A small data release production using HSC data is defined that should result in a data set sufficient to measure the size and orientation of velocity ellipsoids in the Galactic halo. If such measurements are a success, they would independently validate the newly added global astrometric calibration and proper motion measurement capability.
• At the end of the development cycle, the Science Pipelines team delivers to the proto-Operations team a documented and internally tested set of DRP pipelines with the new capabilities defined above. The pipelines pass all unit and small-scale integration tests. The proto-Operations team deploys and re-verifies the received pipelines in the I&T environment designed to closely mimic the production environment. They verify
that the pipelines integrate well with the orchestration system and are capable of executing medium-to-large scale processing. The pipelines pass integration tests.
• The data challenge is operationally planned and executed by the proto-Operations team, including the execution of any predefined QA metrics. The data products and test results are turned over to the Science Validation team.
• The Science Validation team performs the analysis needed to achieve the SV exercise goals (the measurement of velocity ellipsoids, in this case; a sketch of the underlying proper-motion fit follows this list).
• The results and conclusions derived from the data challenge are fed back to the DRP team, DM Project Management, and DM Subsystem Science; they may be used to assess the overall quality of the product, pass a formal requirement, and/or inform future construction decisions.
• Any newly developed but broadly useful tests are identified as such, and fed to the I&T team for inclusion in the battery of tests that are run on a regular basis.
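To make the example concrete, the sketch below shows the kind of per-source computation the new capability would enable: a least-squares proper-motion fit across matched multi-epoch positions. It is a simplified illustration under stated assumptions (small angular offsets, a purely linear motion model); the function and the toy data are hypothetical and do not represent the actual DRP implementation.

    import numpy as np

    def fit_proper_motion(epochs_yr, ra_deg, dec_deg):
        """Least-squares linear fit of position vs. epoch for one matched source.

        Returns (mu_ra * cos(dec), mu_dec) in milliarcseconds per year.
        Assumes offsets are small enough for a flat-sky approximation.
        """
        t = np.asarray(epochs_yr) - np.mean(epochs_yr)  # center the epochs
        dec0 = np.radians(np.mean(dec_deg))
        # Offsets from the mean position, converted to milliarcseconds.
        dra = (np.asarray(ra_deg) - np.mean(ra_deg)) * np.cos(dec0) * 3.6e6
        ddec = (np.asarray(dec_deg) - np.mean(dec_deg)) * 3.6e6
        # The slope of offset vs. time is the proper motion.
        mu_ra = np.polyfit(t, dra, 1)[0]
        mu_dec = np.polyfit(t, ddec, 1)[0]
        return mu_ra, mu_dec

    # Toy source drifting 50 mas/yr in RA*cos(dec) and 20 mas/yr in Dec.
    epochs = [2019.0, 2020.0, 2021.0, 2022.0, 2023.0]
    ra = [150.0 + (50.0 / np.cos(np.radians(30.0)) / 3.6e6) * i for i in range(5)]
    dec = [30.0 + (20.0 / 3.6e6) * i for i in range(5)]
    print(fit_proper_motion(epochs, ra, dec))  # approximately (50.0, 20.0)

Combining many such proper motions with distance estimates yields tangential velocities, whose covariance structure gives the size and orientation of the velocity ellipsoid that this SV exercise sets out to measure.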

A Verification Matrix
The DM verification matrix may be found in LSE-61. A subset of the columns from the matrix is displayed here to indicate how we will verify DM requirements.
Requirement | Name | Method
DMS-REQ-0024 | Raw Image Assembly | Demonstration
DMS-REQ-0018 | Raw Science Image Data Acquisition | Test
DMS-REQ-0068 | Raw Science Image Metadata | Test
DMS-REQ-0022 | Crosstalk Corrected Science Image Data Acquisition | Test
DMS-REQ-0020 | Wavefront Sensor Data Acquisition | Test
DMS-REQ-0265 | Guider Calibration Data Acquisition | Demonstration
DMS-REQ-0004 | Nightly Data Accessible Within 24 hrs | Test
DMS-REQ-0069 | Processed Visit Images | Demonstration
DMS-REQ-0072 | Processed Visit Image Content | Demonstration
DMS-REQ-0029 | Generate Photometric Zeropoint for Visit Image | Demonstration
DMS-REQ-0030 | Generate WCS for Visit Images | Test
DMS-REQ-0070 | Generate PSF for Visit Images | Demonstration
DMS-REQ-0010 | Difference Exposures | Demonstration
DMS-REQ-0074 | Difference Exposure Attributes | Demonstration
DMS-REQ-0266 | Exposure Catalog | Inspection
DMS-REQ-0269 | DIASource Catalog | Demonstration
DMS-REQ-0270 | Faint DIASource Measurements |
DMS-REQ-0271 | DIAObject Catalog | Demonstration
DMS-REQ-0272 | DIAObject Attributes | Demonstration
DMS-REQ-0273 | SSObject Catalog | Demonstration
DMS-REQ-0317 | DIAForcedSource Catalog | Demonstration
DMS-REQ-0274 | Alert Content | Demonstration
DMS-REQ-0097 | Level 1 Data Quality Report Definition | Demonstration
DMS-REQ-0099 | Level 1 Performance Report Definition | Demonstration
DMS-REQ-0101 | Level 1 Calibration Report Definition | Demonstration
DMS-REQ-0267 | Source Catalog | Demonstration
DMS-REQ-0275 | Object Catalog | Demonstration
DMS-REQ-0276 | Object Characterization | Inspection
DMS-REQ-0046 | Provide Photometric Redshifts of Galaxies | Inspection
DMS-REQ-0034 | Associate Sources to Objects | Demonstration
DMS-REQ-0279 | Deep Detection Coadds | Demonstration
DMS-REQ-0280 | Template Coadds | Demonstration
DMS-REQ-0281 | Multi-band Coadds | Demonstration
DMS-REQ-0278 | Coadd Image Attributes | Demonstration
DMS-REQ-0047 | Provide PSF for Coadded Images | Demonstration
DMS-REQ-0106 | Coadded Image Provenance | Demonstration
DMS-REQ-0277 | Coadd Source Catalog | Demonstration
DMS-REQ-0268 | Forced-Source Catalog | Demonstration
DMS-REQ-0103 | Produce Images for EPO | Demonstration
DMS-REQ-0130 | Calibration Data Products | Demonstration
DMS-REQ-0132 | Calibration Image Provenance | Demonstration
DMS-REQ-0059 | Bad Pixel Map | Demonstration
DMS-REQ-0060 | Bias Residual Image | Demonstration
DMS-REQ-0061 | Crosstalk Correction Matrix | Demonstration
DMS-REQ-0282 | Dark Current Correction Frame | Demonstration
DMS-REQ-0063 | Monochromatic Flatfield Data Cube | Demonstration
DMS-REQ-0062 | Illumination Correction Frame | Demonstration
DMS-REQ-0283 | Fringe Correction Frame | Demonstration
DMS-REQ-0291 | Query Repeatability | Demonstration
DMS-REQ-0292 | Uniqueness of IDs Across Data Releases | Demonstration
DMS-REQ-0293 | Selection of Datasets | Demonstration
DMS-REQ-0294 | Processing of Datasets | Demonstration
DMS-REQ-0295 | Transparent Data Access | Demonstration
DMS-REQ-0284 | Level-1 Production Completeness | Demonstration
DMS-REQ-0131 | Calibration Images Available 1 Hour Before Observing | Demonstration
DMS-REQ-0002 | Transient Alert Distribution | Demonstration
DMS-REQ-0285 | Level 1 Source Association | Demonstration
DMS-REQ-0286 | SSObject Precovery | Demonstration
DMS-REQ-0287 | DIASource Precovery | Demonstration
DMS-REQ-0288 | Use of External Orbit Catalogs | Demonstration
DMS-REQ-0089 | Solar System Objects Available within 24 hours | Demonstration
DMS-REQ-0096 | Generate Data Quality Report within 4 hours | Demonstration
DMS-REQ-0098 | Generate DMS Performance Report within 4 hours | Demonstration
DMS-REQ-0100 | Generate Calibration Report within 4 hours | Demonstration
DMS-REQ-0289 | Calibration Production Processing | Inspection
DMS-REQ-0006 | Timely Publication of Level 2 Data Releases | Inspection
DMS-REQ-0290 | Level 3 Data Import | Demonstration
DMS-REQ-0119 | DAC resource allocation for Level 3 processing | Demonstration
DMS-REQ-0120 | Level 3 Data Product Self Consistency | Inspection
DMS-REQ-0121 | Provenance for Level 3 processing at DACs | Inspection
DMS-REQ-0125 | Software framework for Level 3 catalog processing | Demonstration
DMS-REQ-0128 | Software framework for Level 3 image processing | Demonstration
DMS-REQ-0308 | Software Architecture to Enable Community Re-Use | Demonstration
DMS-REQ-0009 | Simulated Data | Demonstration
DMS-REQ-0296 | Pre-cursor, and Real Data | Demonstration
DMS-REQ-0032 | Image Differencing | Demonstration
DMS-REQ-0033 | Provide Source Detection Software | Demonstration
DMS-REQ-0043 | Provide Calibrated Photometry | Demonstration
DMS-REQ-0042 | Provide Astrometric Model | Demonstration
DMS-REQ-0052 | Enable a Range of Shape Measurement Approaches | Demonstration
DMS-REQ-0160 | Provide User Interface Services | Demonstration
DMS-REQ-0297 | DMS Initialization Component | Demonstration
DMS-REQ-0155 | Provide Data Access Services |
DMS-REQ-0298 | Data Product Access | Demonstration
DMS-REQ-0299 | Data Product Ingest | Demonstration
DMS-REQ-0300 | Bulk Download Service | Demonstration
DMS-REQ-0065 | Provide Image Access Services | Demonstration
DMS-REQ-0301 | Control of Level-1 Production | Demonstration
DMS-REQ-0156 | Provide Pipeline Execution Services |
DMS-REQ-0302 | Production Orchestration | Demonstration
DMS-REQ-0303 | Production Monitoring | Demonstration
DMS-REQ-0304 | Production Fault Tolerance | Demonstration
DMS-REQ-0158 | Provide Pipeline Construction Services |
DMS-REQ-0305 | Task Specification | Inspection
DMS-REQ-0306 | Task Configuration | Demonstration
DMS-REQ-0307 | Unique Processing Coverage | Demonstration
DMS-REQ-0309 | Raw Data Archiving Reliability | Demonstration
DMS-REQ-0094 | Keep Historical Alert Archive | Demonstration
DMS-REQ-0310 | Un-Archived Data Product Cache | Demonstration
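Since the matrix is most useful when it can be checked mechanically, the short sketch below tallies the verification methods listed above. It assumes the table has been saved as pipe-separated plain text in a file named verification_matrix.txt; that file name and format are illustrative conventions for this sketch, not project standards.

    # Illustrative sketch: tally verification methods in the matrix above.
    from collections import Counter

    def summarize_matrix(path):
        """Count requirements per verification method in a pipe-separated table."""
        counts = Counter()
        with open(path) as f:
            next(f)  # skip the header row
            for line in f:
                parts = [p.strip() for p in line.split("|")]
                if len(parts) == 3 and parts[0].startswith("DMS-REQ"):
                    counts[parts[2] or "(no method listed)"] += 1
        return counts

    if __name__ == "__main__":
        for method, n in sorted(summarize_matrix("verification_matrix.txt").items()):
            print(f"{method}: {n} requirement(s)")

A tally like this makes it easy to confirm that every requirement carries a method and to see how the verification effort splits across Test, Demonstration, and Inspection.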