io.prediction.data.storage.hbase
:: DeveloperApi :: Read from the database and return the events. The deprecation here is intended for engine developers only.
appId             return events of this app ID
channelId         return events of this channel ID (default channel if it is None)
startTime         return events with eventTime >= startTime
untilTime         return events with eventTime < untilTime
entityType        return events of this entityType
entityId          return events of this entityId
eventNames        return events with any of these event names
targetEntityType  return events of this targetEntityType
targetEntityId    return events of this targetEntityId
sc                Spark context
returns           RDD[Event]
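Since this low-level read is deprecated for engine developers, a typical engine would go through PEventStore.find() instead. The sketch below assumes a running PredictionIO event server; the app name, entity type, and event names ("MyApp", "user", "rate", "buy") are placeholders, not values from this document.

```scala
import io.prediction.data.store.PEventStore
import io.prediction.data.storage.Event
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Sketch: read "rate" and "buy" events of entityType "user" for the
// app named "MyApp", on the default channel, with no time bounds.
def readEvents(sc: SparkContext): RDD[Event] =
  PEventStore.find(
    appName = "MyApp",                      // placeholder app name
    channelName = None,                     // None selects the default channel
    startTime = None,                       // no lower bound on eventTime
    untilTime = None,                       // no upper bound on eventTime
    entityType = Some("user"),
    eventNames = Some(Seq("rate", "buy"))
  )(sc)                                     // returns RDD[Event]
```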
:: DeveloperApi :: Write events to database
events   RDD of Event
appId    the app ID
sc       Spark context
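A minimal write sketch, assuming a configured PredictionIO storage backend (the app ID 1 is a placeholder). Storage.getPEvents() returns the configured PEvents implementation, which here would be the HBase-backed one.

```scala
import io.prediction.data.storage.{Event, Storage}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Sketch: write an RDD of events for app ID 1 (placeholder)
// to the configured events database.
def writeEvents(sc: SparkContext, events: RDD[Event]): Unit = {
  val pEvents = Storage.getPEvents()   // configured PEvents backend, e.g. HBase
  pEvents.write(events, appId = 1)(sc)
}
```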
Aggregate properties of entities based on these special events: $set, $unset, and $delete. The deprecation here is intended for engine developers only.
appId       use events of this app ID
channelId   use events of this channel ID (default channel if it is None)
entityType  aggregate properties of the entities of this entityType
startTime   use events with eventTime >= startTime
untilTime   use events with eventTime < untilTime
required    only keep entities with these required properties defined
sc          Spark context
returns     RDD[(String, PropertyMap)], an RDD of entityId and PropertyMap pairs
(Since version 0.9.2) Use PEventStore.aggregateProperties() instead.
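As the deprecation note says, engine code should call PEventStore.aggregateProperties() rather than this method. A hedged sketch, assuming an event server with $set events for an "item" entity type; the app name and property names ("MyApp", "price", "category") are placeholders.

```scala
import io.prediction.data.store.PEventStore
import io.prediction.data.storage.PropertyMap
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Sketch: aggregate the current properties of all "item" entities in
// "MyApp", keeping only items that define both "price" and "category".
def itemProperties(sc: SparkContext): RDD[(String, PropertyMap)] =
  PEventStore.aggregateProperties(
    appName = "MyApp",                        // placeholder app name
    entityType = "item",
    required = Some(Seq("price", "category")) // drop items missing either property
  )(sc)                                       // RDD of (entityId, PropertyMap)
```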
:: Experimental :: Extract EntityMap[A] from events for the entityType. Note: the returned EntityMap[A] is local (not an RDD).
(Since version 0.9.2) Use PEventStore.aggregateProperties() instead.
(Since version 0.9.2) Use PEventStore.find() instead (deprecation of the read method above).