VarValues Collection
1. Query variable instance values
To get variable values, you can query general-purpose global view entries, realtime entries, or historic raw data via custom views. The queries differ only in the view_type (GlobalView | RealtimeView | CustomView).
Parameters used
- VIEW_INDEX: number // needed for custom views
- VARIABLE_INSTANCE_ID: number // referencing the variable instance id you want the values from
- FROM: number // range begin as unix UTC timestamp. View time range begin will be used if field is empty
- TO: number // range end as unix UTC timestamp. View time range end will be used if field is empty
- RASTER: (Milli_1 | Milli_2 | Milli_5 | Milli_10 | Milli_20 | Milli_25 | Milli_50 | Milli_100 | Milli_200 | Milli_250
| Milli_500 | Second_1 | Second_2 | Second_5 | Second_10 | Second_15 | Second_30 | Minute_1 | Minute_2 | Minute_5
| Minute_10 | Minute_15 | Minute_30 | Hour | Day | Week | Month | Year | Overall ) // Uses default view raster if empty
- AGGREGATION: (Last | LastValid | Avg | Sum | Min | Max | BitOr | BitAnd) // Uses default variable aggregation if empty
// There is always a minimum raster on the server that you MUST NOT undercut; it can be queried using the RawValuesView query.
// You can leave optional fields marked with `*` empty.
GET Request URLs
You can request values for ONE Variable Instance using the following GET requests. Depending on the type of request, you get either a native EOS compressed data stream of VarValues or an uncompressed stream of Double values.
GlobalView
// compressed data stream of VarValues
variable/instance/varvalues/get/view_type=GlobalView&view_index=&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
// Double values stream
variable/instance/dvals/get/view_type=GlobalView&view_index=&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
RealtimeView
// compressed data stream of VarValues
variable/instance/varvalues/get/view_type=RealtimeView&view_index=&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
// Double values stream
variable/instance/dvals/get/view_type=RealtimeView&view_index=&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
CustomView
// compressed data stream of VarValues
variable/instance/varvalues/get/view_type=CustomView&view_index=[VIEW_INDEX]&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
// Double values stream
variable/instance/dvals/get/view_type=CustomView&view_index=[VIEW_INDEX]&id=[VARIABLE_INSTANCE_ID]&from=[*FROM]&to=[*TO]&raster=[*RASTER]&aggr=[*AGGREGATION]
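As a sketch, a GET URL like the ones above could be assembled in TypeScript as follows. The base URL and the helper name are assumptions, not part of the API; optional parameters marked with `*` are simply left empty when omitted:

```typescript
// Sketch: assemble a varvalues GET URL. BASE_URL is a hypothetical
// server root; only the path and parameter names come from the docs.
const BASE_URL = 'https://eos.example.com/';

function buildVarValuesUrl(opts: {
  viewType: 'GlobalView' | 'RealtimeView' | 'CustomView';
  viewIndex?: number;   // required for CustomView only
  instanceId: number;
  from?: number;        // unix UTC ms; defaults to view range begin
  to?: number;          // unix UTC ms; defaults to view range end
  raster?: string;      // e.g. 'Minute_5'; defaults to view raster
  aggregation?: string; // e.g. 'Avg'; defaults to variable aggregation
}): string {
  // Empty string for omitted optional fields, matching the URL patterns above.
  const p = (v?: number | string) => (v === undefined ? '' : String(v));
  return BASE_URL +
    `variable/instance/varvalues/get/view_type=${opts.viewType}` +
    `&view_index=${p(opts.viewIndex)}&id=${opts.instanceId}` +
    `&from=${p(opts.from)}&to=${p(opts.to)}` +
    `&raster=${p(opts.raster)}&aggr=${p(opts.aggregation)}`;
}
```

Swapping `varvalues` for `dvals` in the path yields the uncompressed Double stream variant.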
POST Request URLs
You can request values for multiple Variable Instances using the following POST requests. Depending on the type of request, you get either a native EOS compressed data stream of VarValues or an uncompressed stream of Double values.
// compressed data stream of VarValues
variable/instance/varvalues/get
// Double values stream
variable/instance/dvals/get
JSON POST Request Object:
{
  "view_type": "GlobalView",  // View Type Enum: RealtimeView, GlobalView or CustomView
  "view_index": null,         // CustomView index, if needed
  "from": 1577833200000,      // UTC timestamp of request interval begin (01.01.2020 00:00 MEZ)
  "to": 1614174000000,        // UTC timestamp of request interval end (25.02.2021 00:00 MEZ)
  "raster": "Minute_5",       // request raster
  "instances": [
    {
      "instance_id": 4345,    // ID of variable instance
      "aggregation": "Avg"    // custom aggregation type
    },
    {
      "instance_id": 4672,    // ID of variable instance
      "aggregation": null     // use default variable aggregation
    }
  ]
}
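A request body of this shape could be built and sent as in the following sketch. The helper name and the server root in the usage comment are assumptions; the endpoint path and field names come from the docs:

```typescript
// Sketch: build the multi-instance POST body for the dvals/varvalues
// endpoints. Field names mirror the JSON request object in the docs.
interface InstanceRequest {
  instance_id: number;
  aggregation: string | null; // null -> use default variable aggregation
}

function buildQueryBody(from: number, to: number, raster: string,
                        instances: InstanceRequest[]) {
  return {
    view_type: 'GlobalView',
    view_index: null, // only needed for CustomView
    from,
    to,
    raster,
    instances,
  };
}

// Usage (hypothetical server root):
// await fetch('https://eos.example.com/variable/instance/dvals/get', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildQueryBody(1577833200000, 1614174000000,
//     'Minute_5', [{ instance_id: 4345, aggregation: 'Avg' }])),
// });
```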
Current and Last Valid Values
You can get current values (the VarValue of a Variable Instance at the last available timestamp) using the POST request:
// returns an array of VarValues
variable/instance/currentvalues/get/view_type=[VIEW_TYPE]&view_index=[*VIEW_INDEX]
and POST a JSON list of Variable Instance IDs:
[
  512,
  412,
  ...
  11
]
In some cases the current values request will return NaN values if there is no valid value at the current time. To get the last valid values within a MAX_AGE interval, use the POST request:
// returns an array of VarValues
variable/instance/last_valid_varvalues/get/view_type=[VIEW_TYPE]&view_index=[*VIEW_INDEX]&max_age=[MAX_AGE]
and POST a JSON list of Variable Instance IDs:
[
  512,
  412,
  ...
  11
]
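A small sketch of building the last-valid-values request. The helper name is an assumption, and MAX_AGE is assumed here to be in milliseconds (verify against your server configuration):

```typescript
// Sketch: assemble the last_valid_varvalues URL and the ID-list body.
// Path and parameter names come from the docs; the helper is illustrative.
function buildLastValidUrl(viewType: string, maxAge: number,
                           viewIndex?: number): string {
  const idx = viewIndex === undefined ? '' : String(viewIndex);
  return 'variable/instance/last_valid_varvalues/get/' +
    `view_type=${viewType}&view_index=${idx}&max_age=${maxAge}`;
}

// The POST body is simply the JSON list of Variable Instance IDs:
const idListBody = JSON.stringify([512, 412, 11]);
```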
Native data stream response
EOS native compressed VarValues data stream
{
data_stream: string // values ByteStream as base64 encoded byte array / array buffer
}
Example
{
"data_stream": "AAAAjgEAjgAAAAF2bdXZIH/4AAAAAAAAv/AAAAAAAAADAQAAAAAADbug"
}
Double values response
Double values data stream
{
from: UTC timestamp of first value begin
to: UTC timestamp of last value end
values_bytes: values ByteStream as base64 encoded byte array. Contains 8 bytes per double value
}
The timespan between all double values has the same size, so the begin and end timestamps of each Double value can be calculated as follows:
double[] values = ...;
long timespan = (to - from) / values.length;
for (int i = 0; i < values.length; i++) {
    double dval = values[i];
    long dvalBeginTimestamp = from + i * timespan;
    long dvalEndTimestamp = dvalBeginTimestamp + timespan;
}
You can only use rasters below one Hour to query double values, because rasters of one Hour and above have a varying timespan.
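The same calculation, combined with decoding `values_bytes`, can be sketched in TypeScript. Big-endian byte order is an assumption carried over from the native stream format, and the function name is illustrative:

```typescript
// Sketch: decode a dvals response into timestamped doubles.
// Assumes big-endian doubles (matching the native VarValues stream).
function decodeDoubleValues(
  base64data: string,
  from: number,
  to: number
): { begin: number; end: number; value: number }[] {
  // Unwrap base64 into a byte array.
  const binary = atob(base64data);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  const view = new DataView(bytes.buffer);
  const count = bytes.length / 8;       // 8 bytes per double value
  const timespan = (to - from) / count; // equal spacing between values
  const result: { begin: number; end: number; value: number }[] = [];
  for (let i = 0; i < count; i++) {
    result.push({
      begin: from + i * timespan,
      end: from + (i + 1) * timespan,
      value: view.getFloat64(i * 8, false), // false = big endian
    });
  }
  return result;
}
```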
2. Measured values
Measured values are a special type of VarValues that can be stored in persisted databases, such as the RAW Values Database and the Archive Values Database. There are two options:
a) to import measured values
You can import large amounts of data for long time periods in the past. This operation is heavyweight, so it is recommended to import no more than 5.000.000 values at once. You also need administrator rights to perform this operation.
Request POST URL:
measurements/import/measured_values_map
Example JSON Request:
[
  {
    "instance_id": 4534,  // ID of variable instance to import values into. The variable instance needs to be of PLC type.
    "values": [
      {
        "start_timestamp": 1577833200000, // (01.01.2020 00:00:00 MEZ)
        "end_timestamp": 1577833201000,   // (01.01.2020 00:00:01 MEZ)
        "value": 456.3553466,             // double value
        "quality": 1.0                    // quality value: 1.0 = Good, 0.0 = Bad
      },
      {
        "start_timestamp": 1577833201000, // (01.01.2020 00:00:01 MEZ)
        "end_timestamp": 1577833202000,   // (01.01.2020 00:00:02 MEZ)
        "value": 458.2362566,
        "quality": 1.0
      },
      {
        "start_timestamp": 1577833210000, // (01.01.2020 00:00:10 MEZ)
        "end_timestamp": 1577833260000,   // (01.01.2020 00:01:00 MEZ)
        "value": 0.0,
        "quality": 1.0
      }
    ]
  },
  {
    "instance_id": 4537,
    "values": [...]
  }
]
You can also import predicted measured values for a time period in the future. This request will ignore all values before PLC Now and after the ValuesBounds End Timestamp. As the PLC Now timestamp moves forward, newly incoming measured values will override the predicted measured values. You also need administrator rights to perform this operation.
Request POST URL:
measurements/import/predicted_measured_values_map
b) to append measured values
You can append measured values to the end of the EOS database. This operation accepts only measured values close to the current timestamp and moves the values view end timestamp if needed. It is internally more efficient than importing measured values and should be used when you want to store currently measured values.
Request POST URL:
measurements/append/measured_values_map/sync_exec=[true or false]
The JSON request format is the same as for importing measured values. Set the sync_exec flag to true if you want to read the stored values, or calculated values affected by the change, right afterwards. It is recommended to set the sync_exec flag to false if no attached read will follow.
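A minimal sketch of building an append payload for one freshly measured value. The helper name is an assumption; the payload shape mirrors the import format above:

```typescript
// Sketch: build an append payload for a single measured value.
// Field names follow the measured-values import format in the docs.
function buildAppendPayload(instanceId: number, value: number,
                            start: number, end: number) {
  return [{
    instance_id: instanceId,
    values: [{
      start_timestamp: start, // unix UTC ms
      end_timestamp: end,     // unix UTC ms, close to the current timestamp
      value,
      quality: 1.0,           // 1.0 = Good, 0.0 = Bad
    }],
  }];
}

// Usage (hypothetical server root):
// await fetch('https://eos.example.com/measurements/append/measured_values_map/sync_exec=false', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildAppendPayload(4534, 456.35, start, end)),
// });
```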
3. Parsing of EOS native compressed VarValues data stream
To parse queried values, you first need to do some groundwork.
Datasets that EOS provides often contain lots of values and can quickly reach a data size of 500MB and more. To decrease the data footprint and to ensure optimal performance, the data is stripped to the smallest needed information and serialized into a byte array that is transmitted to the receiver as base64, where it then has to be reconstructed.
So several steps are needed to handle the variable/instance/varvalues/get response. We will show the process using TypeScript code.
Note
We can also provide you with java code on request
First, you need to know that the data comes base64 encoded as a byte array that we will need to handle in the next step. So we unwrap the received data and build an unsigned int8 array.
Typescript code would transform the input data like this:
const binaryString = atob(base64data);
const len = binaryString.length;
const bytes = new Uint8Array(len);
for (let i = 0; i < len; i++) {
  bytes[i] = binaryString.charCodeAt(i);
}
Example result
Given data = "AAAAjgEAjgAAAAF2bdXZIH/4AAAAAAAAv/AAAAAAAAADAQAAAAAADbug"
we would get:
bytes = Uint8Array(42) [0, 0, 0, 142, 1, 0, 142, 0, 0, 0, 1, 118, 109, 213, 217, 32, 127, 248, 0, 0, 0, 0, 0, 0, 191, 240, 0, 0, 0, 0, 0, 0, 3, 1, 0, 0, 0, 0, 0, 13, 187, 160]
Next we transform the byte array into a DataStream object; in TypeScript we are using datastream-js.
Using a DataStream object we can then easily read different byte lengths sequentially, using calls like readFloat64 for double values, readUint16 for char values, and so forth.
Note
Look for our get(), getInt(), getLong(), getDouble(), getChar() methods to see the datastream access in action.
const dataStream = new DataStream(bytes);
and pass this datastream to a newly created value collection:
const collection = new VarValueCollection();
collection.ds = dataStream;
collection.size = collection.getInt();
return collection;
Var Value Collection
The value collection is a wrapper to parse the datastream that is holding a very minimalistic optimized form of all relevant values to keep the data transmission footprint as low as possible.
To achieve this, all data fields are only as big as needed and the data is grouped into clusters. Each data cluster is prepended by a header that tells us which type and which size of data it holds, structured in the following order (all data written in big endian):
Header
* cluster type – Uint8 – (0 = sequence, 1 = identical)
* size – char (uint16)
* header value
  * value type – Uint8 – (0 = VarValue, 1 = StoredVarValue, 2 = ArchiveValue, 3 = MeasuredValue, 4 = SupplementValue)
  * value depending on type
* time delta (ONLY IF cluster size > 1) – long (int64)
Value types
VarValue
- endTimestamp – long (int64)
- value – double (float64)
- quality – double (float64)
- plausibility – Uint8 (Ok, UpperLimit, LowerLimit, Missing)
- value source – Uint8 (None, NaN, Archive, Calculated, Model, Measured, Supplement, ObjectParameter, Mixed)
StoredVarValue
- startTimestamp – long (int64)
- endTimestamp – long (int64)
- value – double (float64)
- quality – double (float64)
- plausibility – Uint8 (Ok, UpperLimit, LowerLimit, Missing)
- value source – Uint8 (None, NaN, Archive, Calculated, Model, Measured, Supplement, ObjectParameter, Mixed)
ArchiveValue
- startTimestamp – long (int64)
- endTimestamp – long (int64)
- value – double (float64)
- quality – double (float64)
- plausibility – Uint8 (Ok, UpperLimit, LowerLimit, Missing)
MeasuredValue
- startTimestamp – long (int64)
- endTimestamp – long (int64)
- value – double (float64)
- quality – double (float64)
SupplementValue
- startTimestamp – long (int64)
- endTimestamp – long (int64)
- value – double (float64)
- plausibility – Uint8 (Ok, UpperLimit, LowerLimit, Missing)
- recordedTimestamp - long (int64)
- userId – int (int32)
If the cluster type is defined as “Sequence”, each following entry in the cluster only contains the numeric value formatted as double (float64), without any further information. Instead of reading all properties, the header value is cloned and only the raw value and the timestamp need to be adjusted. To get the proper timestamp, count each read value in a clusterIndex counter and multiply it by the timedelta constant that was previously provided in the header.
If the cluster type is set to “Identical”, you need to create “cluster size” copies of the value previously provided in the header. You also need to calculate each timestamp as described above.
Keep repeating the above steps until your clusterIndex counter reaches the cluster size, then check whether another cluster is available. If so, repeat as above.
To further see the workings of the value collection we provide you with an example typescript implementation. Please be sure to implement an equivalent in your programming language.
/*******************************************************************************
* Copyright (c) 2019 © Elpro GmbH, Berlin. All rights reserved.
*
* Contributors: Sebastian Wojtowicz, Wladimir Degtjarew - API and implementation
*******************************************************************************/
// tslint:disable:no-bitwise
import {HashableObject, imul, smi} from '@/_helpers';
import DataStream from 'datastream-js';
export enum Plausibility {
Ok, UpperLimit, LowerLimit, Missing
}
export enum ValueSource {
None, NaN, Archive, Calculated, Model, Measured, Supplement, ObjectParameter, Mixed
}
export enum ClusterType {
Sequence, Identical
}
export enum IVarValueType {
VarValue,
StoredVarValue,
ArchiveValue,
MeasuredValue,
SupplementValue
}
enum DurationCurveType {
Increment, Decrement
}
export class VarValueCollection extends HashableObject implements IterableIterator<IVarValue> {
static readonly TWO_PWR_16_DBL = 1 << 16;
static readonly TWO_PWR_32_DBL = VarValueCollection.TWO_PWR_16_DBL *
VarValueCollection.TWO_PWR_16_DBL;
ds: DataStream;
private size: number;
private index = 0;
private clusterType: ClusterType;
private clusterSize: number;
private clusterIndex: number;
private timeDelta: number;
private headerValue: IVarValue;
private counter = 0;
constructor() {
super();
}
static create(byteArray: Uint8Array) {
const collection = new VarValueCollection();
collection.ds = new DataStream(byteArray);
collection.size = collection.getInt();
return collection;
}
[Symbol.iterator](): IterableIterator<IVarValue> {
this.reset();
return this;
}
public reset() {
this.index = 0;
this.counter = 0;
this.clusterIndex = 0;
this.headerValue = null;
this.ds.seek(0);
this.size = this.getInt();
}
public next(): IteratorResult<IVarValue> {
const done = !this.hasNext();
let value = null;
if (!done) {
value = this.nextIVarValue();
}
return {
done,
value
};
}
hashCode(): number {
const bufferSize = this.ds.byteLength;
if (bufferSize === Infinity) {
return 0;
}
let h = 0;
for (let i = 0, counter = 0; i < bufferSize; i++) {
h = (h + this.hashNumber(this.ds.buffer)) | 0;
}
return this.murmurHashOfSize(bufferSize, h);
}
private murmurHashOfSize(size, h) {
h = imul(h, 0xcc9e2d51);
h = imul((h << 15) | (h >>> -15), 0x1b873593);
h = imul((h << 13) | (h >>> -13), 5);
h = ((h + 0xe6546b64) | 0) ^ size;
h = imul(h ^ (h >>> 16), 0x85ebca6b);
h = imul(h ^ (h >>> 13), 0xc2b2ae35);
h = smi(h ^ (h >>> 16));
return h;
}
private hashNumber(n) {
if (n !== n || n === Infinity) {
return 0;
}
let hash = n | 0;
if (hash !== n) {
hash ^= n * 0xffffffff;
}
while (n > 0xffffffff) {
n /= 0xffffffff;
hash ^= n;
}
return smi(hash);
}
getSize(): number {
return this.size;
}
getLong(): number {
const i1 = this.ds.readInt32(DataStream.BIG_ENDIAN);
const i0 = this.ds.readInt32(DataStream.BIG_ENDIAN);
return i1 * VarValueCollection.TWO_PWR_32_DBL + (i0 >>> 0);
}
getDouble(): number {
return this.ds.readFloat64(DataStream.BIG_ENDIAN);
}
getChar(): number {
return this.ds.readUint16(DataStream.BIG_ENDIAN);
}
get(): number {
return this.ds.readUint8();
}
getInt() {
return this.ds.readInt32(DataStream.BIG_ENDIAN);
}
hasNext(): boolean {
const hasMore: boolean = !this.ds.isEof()
|| this.clusterIndex < this.clusterSize;
const isEnd: boolean = this.size === this.index;
if (hasMore === isEnd) {
throw new Error('Something is wrong here. ' +
'Expected and actual length are not the same!');
}
return hasMore;
}
nextIVarValue(): IVarValue {
this.counter++;
let val: IVarValue;
if (this.headerValue == null) {
this.clusterType = this.get() as ClusterType;
this.clusterSize = this.getChar();
this.clusterIndex = 0;
this.headerValue = AbstractVarValue.read(this);
if (this.clusterSize > 1) {
this.timeDelta = this.getLong();
}
}
if (this.clusterIndex === 0) {
val = this.headerValue;
} else {
switch (this.clusterType) {
case ClusterType.Identical:
val = this.headerValue.copyWithTimeShift(this.timeDelta *
this.clusterIndex);
break;
case ClusterType.Sequence:
const value = this.getDouble();
val = this.headerValue.copyWithValueAndTimeShift(value,
this.timeDelta * this.clusterIndex);
break;
default:
throw new Error('Unsupported Cluster Type');
}
}
this.clusterIndex++;
if (this.clusterIndex === this.clusterSize) {
this.headerValue = null;
}
this.index++;
return val;
}
toArray() {
const array: IVarValue[] = new Array(this.size);
let i = 0;
for (const val of this) {
array[i++] = val;
}
return array;
}
}
export interface IVarValue {
getEndTimestamp(): number;
getValue(): number;
getQuality(): number;
getPlausibility(): Plausibility;
getValueSource(): ValueSource;
getType(): IVarValueType;
copyWithTimeShift(timeShift: number): IVarValue;
copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue;
isValid(): boolean;
}
export abstract class AbstractVarValue extends HashableObject implements IVarValue {
public static read(buffer: VarValueCollection): any {
const type = buffer.get();
switch (type) {
case IVarValueType.VarValue:
return VarValue.read(buffer);
case IVarValueType.StoredVarValue:
return StoredVarValue.read(buffer);
case IVarValueType.ArchiveValue:
return ArchiveValue.read(buffer);
case IVarValueType.MeasuredValue:
return MeasuredValue.read(buffer);
case IVarValueType.SupplementValue:
return SupplementValue.read(buffer);
default:
throw new Error('Unsupported IVarValueType');
}
}
static copy(val: IVarValue, currentTimestamp: number) {
return new VarValue(currentTimestamp, val.getValue(), val.getQuality(),
val.getPlausibility(), val.getValueSource());
}
copyWithTimeShift(timeShift: number): IVarValue {
return undefined;
}
copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
return undefined;
}
getEndTimestamp(): number {
return 0;
}
getPlausibility(): Plausibility {
return undefined;
}
getQuality(): number {
return 0;
}
getType(): IVarValueType {
return undefined;
}
getValue(): number {
return 0;
}
getValueSource(): ValueSource {
return undefined;
}
isValid(): boolean {
const valid = Number.isFinite(this.getValue())
&& this.getQuality() > 0
&& this.getPlausibility() === Plausibility.Ok;
return valid;
}
}
export class VarValue extends AbstractVarValue {
protected endTimestamp: number;
protected value: number;
protected quality: number;
protected plausibility: Plausibility;
protected source: ValueSource;
constructor(endTimestamp: number, value: number, quality: number,
plausibility: Plausibility, valueSource: ValueSource) {
super();
this.endTimestamp = endTimestamp;
this.value = value;
this.quality = quality;
this.plausibility = plausibility;
this.source = valueSource;
}
public static read(buffer: VarValueCollection): VarValue {
return new VarValue(
buffer.getLong(),
buffer.getDouble(),
buffer.getDouble(),
buffer.get(),
buffer.get()
);
}
public getType(): IVarValueType {
return IVarValueType.VarValue;
}
public getEndTimestamp(): number {
return this.endTimestamp;
}
public getValue(): number {
return this.value;
}
public getQuality(): number {
return this.quality;
}
public getPlausibility(): Plausibility {
return this.plausibility;
}
public getValueSource(): ValueSource {
return this.source;
}
public copyWithTimeShift(timeShift: number): IVarValue {
return new VarValue(this.endTimestamp + timeShift, this.value,
Number.isFinite(this.value) ? this.quality : 0, this.plausibility,
this.source);
}
public copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
return new VarValue(this.endTimestamp + timeShift, value,
Number.isFinite(value) ? this.quality : 0, this.plausibility, this.source);
}
}
export class ArchiveValue extends AbstractVarValue {
protected startTimestamp: number;
protected endTimestamp: number;
protected value: number;
protected quality: number;
protected plausibility: Plausibility;
constructor(startTimestamp: number, endTimestamp: number, value: number,
quality: number, plausibility: Plausibility) {
super();
this.startTimestamp = startTimestamp;
this.endTimestamp = endTimestamp;
this.value = value;
this.quality = quality;
this.plausibility = plausibility;
}
public static read(buffer: VarValueCollection): ArchiveValue {
return new ArchiveValue(
buffer.getLong(),
buffer.getLong(),
buffer.getDouble(),
buffer.getDouble(),
buffer.get()
);
}
public getType(): IVarValueType {
return IVarValueType.ArchiveValue;
}
public getStartTimestamp(): number {
return this.startTimestamp;
}
public getEndTimestamp(): number {
return this.endTimestamp;
}
public getValue(): number {
return this.value;
}
public getQuality(): number {
return this.quality;
}
public getPlausibility(): Plausibility {
return this.plausibility;
}
public copyWithTimeShift(timeShift: number): IVarValue {
return new ArchiveValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, this.value,
Number.isFinite(this.value) ? this.quality : 0, this.plausibility);
}
public copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
return new ArchiveValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, value,
Number.isFinite(value) ? this.quality : 0, this.plausibility);
}
getValueSource(): ValueSource {
return ValueSource.Archive;
}
}
export class MeasuredValue extends AbstractVarValue {
protected startTimestamp: number;
protected endTimestamp: number;
protected value: number;
protected quality: number;
protected plausibility = Plausibility.Ok;
constructor(startTimestamp: number, endTimestamp: number, value: number,
quality: number) {
super();
this.startTimestamp = startTimestamp;
this.endTimestamp = endTimestamp;
this.value = value;
this.quality = quality;
}
public static read(buffer: VarValueCollection): MeasuredValue {
const measuredValue = new MeasuredValue(
buffer.getLong(),
buffer.getLong(),
buffer.getDouble(),
buffer.getDouble()
);
measuredValue.plausibility = Object.values(Plausibility)[buffer.get()]
as Plausibility;
return measuredValue;
}
public getType(): IVarValueType {
return IVarValueType.MeasuredValue;
}
public getStartTimestamp(): number {
return this.startTimestamp;
}
public getEndTimestamp(): number {
return this.endTimestamp;
}
public getValue(): number {
return this.value;
}
public getQuality(): number {
return this.quality;
}
public getPlausibility(): Plausibility {
return this.plausibility;
}
public copyWithTimeShift(timeShift: number): IVarValue {
const measuredValue = new MeasuredValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, this.value,
Number.isFinite(this.value) ? this.quality : 0);
measuredValue.plausibility = this.plausibility;
return measuredValue;
}
public copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
const measuredValue = new MeasuredValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, value,
Number.isFinite(value) ? this.quality : 0);
measuredValue.plausibility = this.plausibility;
return measuredValue;
}
getValueSource(): ValueSource {
return ValueSource.Measured;
}
}
export class StoredVarValue extends VarValue {
protected startTimestamp: number;
constructor(startTimestamp: number, endTimestamp: number, value: number,
quality: number, plausibility: Plausibility, valueSource: ValueSource) {
super(endTimestamp, value, quality, plausibility, valueSource);
this.startTimestamp = startTimestamp;
}
public static read(buffer: VarValueCollection): StoredVarValue {
return new StoredVarValue(
buffer.getLong(),
buffer.getLong(),
buffer.getDouble(),
buffer.getDouble(),
buffer.get(),
buffer.get()
);
}
public getType(): IVarValueType {
return IVarValueType.StoredVarValue;
}
public getStartTimestamp(): number {
return this.startTimestamp;
}
public copyWithTimeShift(timeShift: number): IVarValue {
return new StoredVarValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, this.value,
Number.isFinite(this.value) ? this.quality : 0, this.plausibility,
this.source);
}
public copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
return new StoredVarValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, value,
Number.isFinite(value) ? this.quality : 0, this.plausibility, this.source);
}
}
export class SupplementValue extends AbstractVarValue {
protected startTimestamp: number;
protected endTimestamp: number;
protected value: number;
protected plausibility: Plausibility;
protected recordedTimestamp: number;
protected userId: number;
constructor(startTimestamp: number, endTimestamp: number, value: number,
plausibility: Plausibility, recordedTimestamp: number, userId: number){
super();
this.startTimestamp = startTimestamp;
this.endTimestamp = endTimestamp;
this.value = value;
this.plausibility = plausibility;
this.recordedTimestamp = recordedTimestamp;
this.userId = userId;
}
public static read(buffer: VarValueCollection): SupplementValue {
return new SupplementValue(
buffer.getLong(),
buffer.getLong(),
buffer.getDouble(),
buffer.get(),
buffer.getLong(),
buffer.getInt()
);
}
public getType(): IVarValueType {
return IVarValueType.SupplementValue;
}
public getStartTimestamp(): number {
return this.startTimestamp;
}
public getEndTimestamp(): number {
return this.endTimestamp;
}
public getValue(): number {
return this.value;
}
public getQuality(): number {
return 1;
}
public getPlausibility(): Plausibility {
return this.plausibility;
}
public getRecordedTimestamp(): number {
return this.recordedTimestamp;
}
public getUserId(): number {
return this.userId;
}
public copyWithTimeShift(timeShift: number): IVarValue {
return new SupplementValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, this.value, this.plausibility,
this.recordedTimestamp, this.userId);
}
public copyWithValueAndTimeShift(value: number, timeShift: number): IVarValue {
return new SupplementValue(this.startTimestamp + timeShift,
this.endTimestamp + timeShift, value, this.plausibility,
this.recordedTimestamp, this.userId);
}
getValueSource(): ValueSource {
return ValueSource.Supplement;
}
}
export class FastIVarValuesArrayIterable implements IterableIterator<IVarValue[]> {
private array: VarValueCollection[];
constructor(array: VarValueCollection[]) {
const size = array[0].getSize();
for (const c of array) {
if (c.getSize() !== size) {
throw new Error('Collections are not of same size!');
}
}
this.array = array;
}
[Symbol.iterator](): IterableIterator<IVarValue[]> {
return this;
}
public next(): IteratorResult<IVarValue[]> {
const hasNext = this.array[0].hasNext();
const valueArray: IVarValue[] = [];
for (const collection of this.array) {
valueArray.push(collection.next().value);
}
return {
done: !hasNext,
value: valueArray
};
}
}