Jobs in DATAmaestro Lake execute an action at a given frequency. The following job types are available in DATAmaestro Lake.
Job | Description |
---|---|
Computed tag | Calculate functions based on other tags within the Lake |
Delayed tag | Delay tag in time |
Export to Analytics | Create a reoccurring export of data from the Lake to an Analytics project. |
Imported Model (Alpha) | Static model in JSON, exported from an Analytics project. If the model changes in the Analytics project, this exported model does not change. Not recommended. |
Model Predict | Calculate the real-time outputs of a predictive model directly linked to an Analytics task |
Moving Average Tag | Calculate a simple moving average of a tag over a defined period |
Script | Multipurpose scripting tag options (Advanced) |
Synchronizer | Synchronize multiple DATAmaestro Lakes. |
Tag Replay | Replay or loop data in the Lake (for the purposes of training or demos) |
E-mail alert | Send email alerts based on predefined rules (email server must be available) |
Batch Feature | Calculate features per batch ("fingerprinting") |
From the jobs page, all jobs can be created, edited or deleted. There is also information on a job’s status: executing (indicates job is currently running), pending (waiting for next execution) or error (an error message will be provided).
From this menu, create jobs for computed tags or exports to DATAmaestro analytics.
Click Manage > Jobs in the menu.
Click + New job button on top right.
Choose a New Job Type.
Click Create.
Enter Job id: /folder/Job Name. Example: /demo/Automotive/Computed Tag/PercentageValue.
Select Scheduling.
Standard: Job will be executed a first time after the initial delay and then execution will occur at regular intervals. This regular interval is defined by the period.
Every day: Job will be executed every day at the same time.
Enter Initial delay. If the initial delay is equal to zero, the job starts as soon as the Save button is clicked.
Enter Period, the time interval between each job execution. Important: the units must be included; for example, for a period of 10 minutes: 10m.
Letter | Description |
---|---|
d | day |
h | hour |
m | minute |
s | second |
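As an informal sketch (not the Lake's actual parser), the unit letters above can be interpreted like this; the function name is hypothetical:

```javascript
// Illustrative sketch only: converts a period string such as "10m" into
// milliseconds, using the unit letters from the table above.
function periodToMillis(period) {
  const units = { s: 1000, m: 60000, h: 3600000, d: 86400000 };
  const match = /^(\d+)([smhd])$/.exec(period.trim());
  if (!match) throw new Error(`Invalid period: ${period}`);
  return Number(match[1]) * units[match[2]];
}

console.log(periodToMillis("10m")); // 600000
```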
Period for a Computed tag: The period does not influence the frequency or number of output values written by the Computed Tag job itself, only when the calculation runs. The frequency and number of output values written by the job depend on the frequency and number of the input variable(s). For example, a job with a Period set to 24h will run every 24 hours. If the input tags are sampled every minute, the job will write values every minute. Each time the job runs, it looks up the last output value and continues the calculation from that point. Therefore, if the job last ran 24 hours ago, it will write an output value every minute (for minute-sampled data) for the last 24 hours. The resulting calculation does not depend on the period.
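The catch-up behaviour described above can be sketched as follows; the sample data and function name are hypothetical, not the Lake's internal API:

```javascript
// Hypothetical sketch of the catch-up behaviour: the job reads the timestamp
// of its last output and computes one output per input sample after that
// point, regardless of how often the job itself runs.
function catchUp(inputSamples, lastOutputTs, compute) {
  return inputSamples
    .filter(s => s.ts > lastOutputTs)            // only samples not yet processed
    .map(s => ({ ts: s.ts, value: compute(s.value) }));
}

// Minute-sampled input; the job last wrote an output at ts = 1.
const inputs = [{ ts: 1, value: 0.1 }, { ts: 2, value: 0.2 }, { ts: 3, value: 0.3 }];
const outputs = catchUp(inputs, 1, v => v * 100);
console.log(outputs.length); // 2: one output per unprocessed input sample
```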
Enter the Class name, which indicates the type of job:
Computed tags: be.pepite.pepito.data.historian.tm.jobs.ComputedTagTask
Exports to Analytics: be.pepite.pepito.data.historian.tm.jobs.ExportToAnalyticsTask
A computed tag calculates functions based on other tags within the Lake.
Two steps to creating a computed tag in DATAmaestro Lake:
Create new tag “shell”.
Create a job to calculate values to fill new tag “shell”.
Introduction to Lake Access rights:
All jobs within DATAmaestro Lake must be saved in a folder that has access to the data they will require to make their calculation.
If a job is created at “Cement Data…” it has access to all tags in the folder “Cement Data” AND any subfolders within Cement Data (e.g., Computed).
However, if a job is created at “Cement Data/Computed”, it ONLY has access to tags in the sub folder “Computed”.
Note: A forward slash “/” denotes the next level in the folder structure.
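The folder-scoped access rule above amounts to a path-prefix check; a minimal sketch, with a hypothetical function name:

```javascript
// Illustrative sketch of the folder-scoped access rule described above:
// a job can read a tag only if the tag's path lies under the job's folder.
function jobCanAccess(jobFolder, tagPath) {
  const prefix = jobFolder.endsWith("/") ? jobFolder : jobFolder + "/";
  return tagPath.startsWith(prefix);
}

console.log(jobCanAccess("/Cement Data", "/Cement Data/Computed/Tag1")); // true
console.log(jobCanAccess("/Cement Data/Computed", "/Cement Data/Tag2")); // false
```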
This step creates a new empty tag within the Lake; it is the first step required to create computed tags.
From this menu, create jobs for computed tags or exports to DATAmaestro analytics.
Click Manage > Jobs in the menu.
Click + New job button on top right.
Choose a New Job Type: ComputedTagTask
Click Create.
Enter Job id. Note: The job must be in a Folder that has access to the necessary data to be executed.
Click + New Folder
Select Base Folder. Choose the same base folder as the tags used by the computed tag.
Enter New Folder name. Example: computed.
Create a New Folder to put all computed tags in the same folder.
Example of Job Id: /Site A/Line 1/computed/Delta (predict - real)
Select Scheduling.
If scheduling Every day, enter At time (hours):(minutes).
Standard: Job will be executed a first time after the initial delay and then execution will occur at regular intervals. This regular interval is defined by the period.
Every day: Job will be executed every day at the same time.
If scheduling Standard, enter Initial delay. If the initial delay is equal to zero, the job starts as soon as the Save button is clicked.
Enter Period, the time interval between each job execution. Important: the units must be included; for example, for a period of 10 minutes: 10m.

Period for a Computed tag: The period does not influence the frequency or number of output values written by the Computed Tag job itself, only when the calculation runs. The frequency and number of output values written by the job depend on the frequency and number of the input variable(s). For example, a job with a Period set to 24h will run every 24 hours. If the input tags are sampled every minute, the job will write values every minute. Each time the job runs, it looks up the last output value and continues the calculation from that point. Therefore, if the job last ran 24 hours ago, it will write an output value every minute (for minute-sampled data) for the last 24 hours. The resulting calculation does not depend on the period.
Class name indicates the type of job. This field is automatically filled.
Computed tags: be.pepite.pepito.data.historian.tm.jobs.ComputedTagTask
Before creating your computed tags, create a Computed folder. For more information, see Create New Folder.
Letter | Description |
---|---|
s | second |
m | minute |
h | hour |
d | day |
Before creating a computed tag, you first need to create a New Tag. The new tag is empty of data; the calculation performed by the computed tag will fill it with data.
Click + New Tag button.
Enter Folder; the path is automatically filled.
Enter tag Name.
Enter Title, if required.
Enter Type, Numeric or Symbolic.
Enter Unit, if required.
Enter Description, if required.
Click Save.
Select Computed Tag. Note: You can only select tags in the folder or subfolders of the job.
Select Input tags. Note: You can only select tags in the folder or subfolders of the job.
The Compute latest value checkbox is checked by default. The Lake allows several values to be inserted at the same timestamp. When this option is disabled, the computed tag does not calculate a new value for the last input, since it cannot be sure whether another value will arrive at the same timestamp; the computed tag can therefore be one value behind. When Compute latest value is true, the computed tag calculates the last input even if a new value at the same timestamp may be inserted later.
Write the script. The script must be written in JavaScript.
Click Save.
Click Truncate to remove data of the computed tags from a particular date onwards.
In the example below, the new computed tag Percentage of bad units simply transforms the variable Ratio of bad units into a percentage value by multiplying it by 100.
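The exact variable names exposed to a computed-tag script are product-specific, so the sketch below only illustrates the calculation itself, with a hypothetical function name:

```javascript
// Illustrative only: the percentage transformation performed by the job.
// Converts a ratio (e.g. 0.25) into a percentage (25) by multiplying by 100.
function computePercentage(ratioOfBadUnits) {
  return ratioOfBadUnits * 100;
}

console.log(computePercentage(0.25)); // 25
```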
In this tab it is possible to edit all the properties of the job in a JSON format. For computed tags the properties that can be edited are:
output tags
input tags
script function
Compute latest value = false
output tags
input tags
script function
Compute latest value = true (in the JSON properties this appears as isOptimistic, which is still true)
For example, suppose you want a computed tag that uses data uploaded every minute together with a laboratory measurement uploaded only once per day. In this case, you can decide to keep that same value valid for 24 hours (tag validity = 24h). In Advanced properties, write the script below; in this example, the validity is 3 days. Warning: this option should only be used with very infrequent measurements. Please contact PEPITe support for more information.
```json
{
  "outputTag": "demo/tagAB",
  "tagValidityMap": {
    "/demo/tagA": 259200000,
    "/demo/tagB": 259200000
  }
}
```
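The 259200000 values in the example are the 3-day validity expressed in milliseconds; as a quick check:

```javascript
// Tag validity is given in milliseconds: 3 days = 3 × 24 × 60 × 60 × 1000.
const threeDaysMs = 3 * 24 * 60 * 60 * 1000;
console.log(threeDaysMs); // 259200000
```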
From this menu, shift a tag backwards or forwards in time.
Click Manage > Jobs in the menu.
Click + New job button on top right.
Choose a New Job Type: DelayedTagTask.
Click Create.
Class name indicates the type of job. This field is automatically filled.
Delayed tags: be.pepite.pepito.data.historian.tm.jobs.DelayedTagTask
```json
{
  "delay": -3600000,  // delayed 1 h
  "suffix": "_delayed",
  "tags": [
    "/folder/site/Tag Name"
  ],
  "isOptimistic": true
}
```
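The delay property above is expressed in milliseconds (-3600000 is one hour). A minimal sketch of the timestamp shift, with hypothetical names and sample data; which sign shifts forwards or backwards is an assumption here:

```javascript
// Hypothetical sketch: shifting each sample's timestamp by the job's
// "delay" property (in milliseconds).
function delaySamples(samples, delayMs) {
  return samples.map(s => ({ ts: s.ts + delayMs, value: s.value }));
}

const oneHourBack = -3600000;
const shifted = delaySamples([{ ts: 7200000, value: 42 }], oneHourBack);
console.log(shifted[0].ts); // 3600000
```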
From this menu, create jobs to export data to DATAmaestro analytics.
Class name indicates the type of job. This field is automatically filled.
Exports to Analytics: be.pepite.pepito.data.historian.tm.jobs.ExportToAnalyticsTask
Enter DMFF file name, example : nameDMFFfile_rawextract.dmff. Hint: Search the DMFF file name in Data sources of your DATAmaestro project.
DMFF file name, Project id and Task id are very important for the export job: they allow the job to automatically update the project data source and, consequently, all tasks in DATAmaestro.
Note: the DMFF file will replace an existing DMFF with the same name. This means that if the same file is used in two different Analytics projects (for example, if you create a copy of the project) and the DMFF is updated in only one project, but both projects use the same file name, the DMFF file will automatically be updated in the other project too.
In this tab it is possible to edit all the properties of the job in a JSON format. For export to DATAmaestro the properties that can be edited are:
From this menu, create an imported model created in Analytics and exported in JSON format.
In this tab it is possible to edit all the properties of the job in a JSON format. For model tag task the properties that can be edited are:
ISHM indicates when it encounters an unknown set of operating conditions. In the example below, the ISHM model was not trained on Chemical type = II; you can see the indication Unknown Condition(s): C - Chemical type = II.
From this menu, create a model predict job of a model created in Analytics.
In this tab it is possible to edit all the properties of the job in a JSON format. For model predict task the properties that can be edited are:
From this menu, create a simple moving average, calculated by taking the average of a set of values over a defined period length. It is a technique used to smooth a curve and filter out noise.
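The calculation can be sketched as follows; this illustrates a simple moving average, not the Lake's implementation:

```javascript
// Simple moving average over a sliding window of n values: each output is
// the mean of the current value and the (n - 1) values before it.
function movingAverage(values, windowSize) {
  const out = [];
  for (let i = windowSize - 1; i < values.length; i++) {
    const window = values.slice(i - windowSize + 1, i + 1);
    out.push(window.reduce((a, b) => a + b, 0) / windowSize);
  }
  return out;
}

console.log(movingAverage([1, 2, 3, 4, 5], 3)); // [2, 3, 4]
```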
In this tab it is possible to edit all the properties of the job in a JSON format. For moving average tag task the properties that can be edited are:
From this menu, create a script tag.
In this tab it is possible to edit all the properties of the job in a JSON format.
Note: Examples of script tags will be added soon.
From this menu, copy tags from a remote Lake.
Synchronizer Task: be.pepite.pepito.data.historian.tm.jobs.SynchronizerTask
The script must contain:
```json
{
  "password": "admin",
  "remotePath": [
    "/demo/computed2"
  ],
  "recurse": false,
  "serverURL": "http://localhost:8888",
  "disabled": false,
  "user": "admin"
}
```
From this menu, replay a tag's data over another period of time.
```json
{
  "endDate": "2005-08-19T10:00:00Z",     // original end date
  "replayDate": "2018-11-01T00:00:00Z",  // start date used for the replayed data
  "startDate": "2005-08-18T13:00:00Z"    // original start date
}
```
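The mapping implied by these properties shifts each original timestamp so that startDate lands on replayDate; a minimal sketch with a hypothetical function name:

```javascript
// Sketch of the replay mapping: each original timestamp is shifted by the
// offset between replayDate and startDate.
function replayTimestamp(originalTs, startDate, replayDate) {
  const offset = Date.parse(replayDate) - Date.parse(startDate);
  return new Date(Date.parse(originalTs) + offset).toISOString();
}

console.log(replayTimestamp(
  "2005-08-18T14:00:00Z",  // one hour after the original start
  "2005-08-18T13:00:00Z",
  "2018-11-01T00:00:00Z"
)); // "2018-11-01T01:00:00.000Z"
```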
From this menu, send an email alert based on a condition.
In this tab it is possible to edit all the properties of the job in a JSON format. For mail alert task the properties that can be edited are:
Click Admin > Tasks (Jobs) in the menu.
Click Edit icon.
Make changes.
Click Save.
Check last values and dates within Lake Jobs.