Jobs

Jobs in DATAmaestro Lake execute an action at a given frequency. The following types of jobs are available in DATAmaestro Lake.

...

  1. Enter the Job id in the form /folder/Job Name. Example: /demo/Automotive/Computed Tag/PercentageValue.

  2. Select Scheduling.

    1. Standard: the job is executed a first time after the initial delay, then at regular intervals defined by the period.

    2. Every day: the job is executed every day at the same time.

  3. Enter the Initial delay. If the initial delay is zero, the job starts as soon as the Save button is clicked.

  4. Enter the Period, the time interval between two job executions. Important: the units must be included; for example, a period of 10 minutes is written 10m.

    Letter | Description
    d      | day
    h      | hour
    m      | minute
    s      | second

    Period for a Computed tag: the period does not influence the frequency or number of output values written by the Computed Tag job, only when the calculation runs. The frequency and number of output values depend on the frequency and number of the input variable(s). For example, a job with a Period of 24h runs every 24 hours. If the input tags are sampled every minute, it will write values every minute. Each time the job runs, it looks up the last output value and continues the calculation from that point; so if the job last ran 24 hours ago, it writes an output value every minute (for minute-sampled data) covering the last 24 hours. The resulting calculation does not depend on the period.

  5. Enter the Class name; this indicates the type of job (see the sketch after this step):

    1. Computed tags: be.pepite.pepito.data.historian.tm.jobs.ComputedTagTask

    2. Exports to Analytics: be.pepite.pepito.data.historian.tm.jobs.ExportToAnalyticsTask
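
For reference, the fields from steps 1 to 5 come together in the job definition, which can also be viewed and edited as JSON on the Advanced tab (see below). The sketch that follows is only an illustration: the property names (jobId, scheduling, initialDelay, period, className) are assumptions made for this example, and the // comments are explanatory notes, not valid JSON. The actual names are those shown in your own job's Advanced tab.

    {
      "jobId": "/demo/Automotive/Computed Tag/PercentageValue",               // step 1: Job id (/folder/Job Name)
      "scheduling": "standard",                                               // step 2: Standard or Every day
      "initialDelay": "0s",                                                   // step 3: 0 starts the job on Save
      "period": "24h",                                                        // step 4: units required (d, h, m, s)
      "className": "be.pepite.pepito.data.historian.tm.jobs.ComputedTagTask"  // step 5: Computed tag job
    }

The period and initial delay use the same unit letters as the table in step 4.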

...

Info: DMFF file name

The DMFF file name, Project id and Task Id are essential information for the export job: they allow the job to automatically update the project data source and, consequently, all tasks in DATAmaestro.


Info: Warning

Note that an exported DMFF file will replace an existing DMFF with the same name. This means that if the same file is used in two different Analytics projects (for example, if you create a copy of the project) and the DMFF is updated in only one project, the DMFF file will be automatically updated in the other project too, because both projects reference the same file name.


On the Advanced tab:

In this tab it is possible to edit all the properties of the job in JSON format. For exports to DATAmaestro, the properties that can be edited are:

...

  • Condition (Tags, condition and value) 
  • Recipients (mails)
  • Alert duration 
  • Alert title 
  • Throttling period 
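
As an illustration of how these properties might look in the JSON editor, the sketch below configures an email alert. The property names (condition, tags, value, recipients, alertDuration, alertTitle, throttlingPeriod), the tag path and the e-mail address are assumptions made for this example, and the // comments are explanatory notes, not valid JSON; check the actual property names in your own job's Advanced tab.

    {
      "condition": {
        "tags": ["/demo/Automotive/Temperature"],  // tag(s) to monitor (hypothetical path)
        "condition": ">",                          // comparison applied to the tag value
        "value": 80                                // threshold value
      },
      "recipients": ["operator@example.com"],      // e-mail addresses to notify
      "alertDuration": "10m",                      // how long the condition must hold before alerting
      "alertTitle": "High temperature alert",
      "throttlingPeriod": "1h"                     // minimum time between two consecutive alerts
    }

Durations such as 10m and 1h use the same unit letters (d, h, m, s) as the job Period.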



Info: Be notified by email when data stops arriving in the Lake

(Image)

Edit Jobs

  1. Click Admin > Tasks (Jobs) in the menu.

  2. Click the Edit icon.

  3. Make changes.

  4. Click Save.

...