
Hastic server

Website | Twitter

Implementation of basic pattern recognition for anomaly detection.

Implementation of the analytics unit for Hastic.

See also:

  • Hooks - notifications about events
  • REST - for developing your plugins
  • HasticPanel - Hastic visualisation plugin for Grafana

Build & run

Hastic server requires a Grafana API key (http://<your_grafana_url>/org/apikeys) to query data from Grafana datasources. The API key only needs the Viewer role.
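
To sanity-check the key before starting the server, you can call Grafana's HTTP API with it (a minimal sketch, assuming your Grafana instance is reachable at <your_grafana_url>; a 401 response means the key is not accepted):

# Grafana accepts API keys as a bearer token
curl -H "Authorization: Bearer <your_grafana_api_key>" http://<your_grafana_url>/api/org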

You can install and run it in two ways:

Linux

Environment variables

You can export the following environment variables for hastic-server to use (see the example after the list):

  • HASTIC_API_KEY - (required) API key of your Grafana instance
  • HASTIC_PORT - (optional) port you want to run the server on, default: 8000
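
For a one-off run you can also pass them inline instead of exporting them (a sketch, assuming the server is started with npm start from the server directory, as shown in the Run section below):

HASTIC_API_KEY=<your_grafana_api_key> HASTIC_PORT=8000 npm start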

Dependencies

System prerequisites (inferred from the installation steps below):

  • git
  • Node.js with npm
  • Python 3 with pip3

Installation

# Python packages used by the analytics unit
pip3 install pandas seglearn scipy tsfresh

# Fetch the sources and build the server
git clone https://github.com/hastic/hastic-server.git
cd ./hastic-server/server
npm install
npm run build

Run

# Tell hastic-server how to reach Grafana and which port to listen on
export HASTIC_API_KEY=<your_grafana_api_key>
export HASTIC_PORT=<port_you_want_to_run_server_on>

cd ./hastic-server/server
npm start
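
To check that the server is listening, you can probe it from another shell (a sketch, assuming the default port 8000; the exact response depends on the routes described in REST):

curl -i http://localhost:8000/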

Docker

Build

git clone https://github.com/hastic/hastic-server.git
cd hastic-server
docker build -t hastic-server .

Run

# Map host port 80 to the server's port 8000 inside the container
docker run -d --name hastic-server -p 80:8000 -e HASTIC_API_KEY=<your_grafana_api_key> hastic-server
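
To follow the server output or stop the container later, the standard Docker commands apply (shown here for convenience, assuming the container name used above):

docker logs -f hastic-server
docker stop hastic-server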

Known bugs & issues

  • Adding labeled segments while learning is in progress is not supported
  • The dataset doesn't update after the first learning run
  • Currently only the InfluxDB datasource is supported