
More Use Cases for the DBS


  • Prodagent - on CMS Wiki - Already running using DBS. Not looking for more use cases.
  • CRAB - on CMS Wiki - Same as Prodagent.
  • ProdRequest - wiki page not found - People use ProdRequest to request that MC production or analysis jobs be run at another location.
  • JobRobot - on CMS Wiki - The JobRobot may contact the DBS to select data samples to use for synthetic loads it submits. At the end of a run, it won't inject results into the DBS because it throws them away.
  • Tools Written by Tier 0 Operational Managers - What are these tools?
  • PhEDEx - on CMS Wiki - PhEDEx needs to see which file blocks exist and what their status is.
  • Data Discovery Web Page
  • Maintenance Tools for Local and Global DBS

Usage Narratives

Tier 2 PhEDEx Operator Keeps Data Store Consistent with Tier 1

People monitor the Data Discovery Web Page to see when new file blocks have arrived from Tier 1. When they see the new blocks, they tell PhEDEx to transfer those blocks to Tier 2.

Use Cases

Breakdown of Use Cases

Combine Prodagent and CRAB, showing differences as variation points in the use case.

Prodagent and CRAB

Run an Analysis Job

  1. Prodagent checks Global DBS for input dataset file block ids.
  2. DLS tells Prodagent where the LFNs corresponding to those file blocks are located.
  3. Global DBS provides the dataset LFNs corresponding to the requested lumi sections, along with the list of lumi sections.
  4. Prodagent records tracked parameters from configuration files in parameter database.
  5. Prodagent creates the processed dataset in Prodagent-Local DBS. It records the input processed or analysis dataset with its lumi sections, the application and its version, and a pointer to the tracked parameters.
  6. Prodagent constructs files for grid submission.
  7. Appropriate Grid manager executes job at requested computing site.
  8. As jobs return, Prodagent puts the files into Prodagent-Local DBS, associating them with the output processed dataset. When there are enough files, it consolidates them into file blocks and registers these with the Prodagent-Local DBS.
  9. Prodagent registers completed job in Global DBS or Local DBS, as requested. It gets this information from the Prodagent-Local DBS.

API Calls
In COMP/Prodagent/Client/Python/:
  • createPrimaryDataset(primdataset)
  • createProcessedDataset(dataset)
  • createProcessing(processing)
  • getDatasetFileBlocks(datasetPath)
  • createFileBlock(block)
  • insertFiles(fileblock,fileList)
  • insertEventCollections(dataset, eventCollectionList)
  • getDatasetContents(dbspath)
  • getDatasetFileBlocks(dbspath)
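The registration steps above can be sketched against the API names listed here. The `DbsApi` stub class, its constructor argument, and all return values below are illustrative assumptions, not the actual COMP/Prodagent/Client/Python interface:

```python
# Hypothetical sketch of steps 5 and 8 of the Prodagent flow.  The DbsApi
# stub and its return shapes are assumptions for illustration; the real
# client lives in COMP/Prodagent/Client/Python/ and may differ.

class DbsApi:
    """Stand-in for a DBS client instance (e.g. Prodagent-Local DBS)."""

    def __init__(self, instance):
        self.instance = instance      # assumed instance label, e.g. "prodagent-local"
        self._blocks = {}             # file block name -> list of registered files

    def createProcessedDataset(self, dataset):
        # Step 5: register the output processed dataset.
        return dataset["path"]

    def createFileBlock(self, block):
        # Step 8: open a new file block for consolidated job output.
        self._blocks[block] = []
        return block

    def insertFiles(self, fileblock, fileList):
        # Step 8: attach returned job output files to their block.
        self._blocks[fileblock].extend(fileList)
        return len(self._blocks[fileblock])


local_dbs = DbsApi("prodagent-local")
path = local_dbs.createProcessedDataset({"path": "/Primary/Processed/RECO"})
block = local_dbs.createFileBlock(path + "#block-1")
n = local_dbs.insertFiles(block, ["job1.root", "job2.root"])
print(n)   # 2 files now registered in the block
```

The sketch only shows the local-DBS side; step 9's promotion to Global DBS would be a separate call against a second `DbsApi` instance.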


JobRobot Queries DBS and DLS to Discover Datasets

  1. JobRobot supplies a pattern describing the desired datasets. (What pattern? What kinds of datasets does it look for?)
  2. DBS responds with the number of events in all datasets matching the given pattern. It returns identifiers so that the DLS can find the file blocks associated with each dataset.
  3. Given a list of file blocks, the DLS locates sites for each file block.
  4. The JobRobot lists each dataset, site, and number of events.

API Calls

  • listProcessedDatasets(pattern)
  • get(datasetPathName)
  • getDatasetContents(dataset)
  • Exceptions raised: DbsApiException, DbsCgiToolError, DbsCgiBadResponse, InvalidTier
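The JobRobot discovery loop above can be sketched as follows. The stubbed `listProcessedDatasets` catalog, the `locate` DLS lookup, the site name, and the event counts are all illustrative assumptions; only the function and exception names come from the API list:

```python
# Hypothetical sketch of the JobRobot discovery steps.  The in-memory
# catalog, event counts, and site names are made-up illustration data.

class DbsCgiToolError(Exception):
    """Raised when a DBS query fails (exception name from the API list)."""

# Assumed catalog: dataset path -> number of events.
_CATALOG = {
    "/TTbar/CMSSW_1_0_0/RECO": 25000,
    "/ZMuMu/CMSSW_1_0_0/RECO": 12000,
}

def listProcessedDatasets(pattern):
    # Step 1-2: pattern match against known dataset paths.
    matches = [p for p in _CATALOG if pattern in p]
    if not matches:
        raise DbsCgiToolError("no dataset matches pattern %r" % pattern)
    return matches

def locate(fileblock):
    # Step 3: stand-in for the DLS lookup, file block -> hosting sites.
    return ["T2_Example_Site"]

# Step 4: list each dataset, a hosting site, and its event count.
rows = []
for dataset_path in listProcessedDatasets("RECO"):
    sites = locate(dataset_path + "#block-1")
    rows.append((dataset_path, sites[0], _CATALOG[dataset_path]))
for row in rows:
    print(*row)
```

A real client would also catch DbsCgiBadResponse and InvalidTier around the query, as the API list suggests.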

-- AndrewDolgert - 28 Nov 2006