Step 3: Monitoring the Metrics on the Lens AI Server

The Lens AI server is used to monitor data and model drift and to access the sampled data.

Lens AI Monitoring Server Components

The Lens AI Server comprises five major components:

  1. Lens AI Sensor Data Handler: Manages sensor HTTP requests and stores incoming sensor data.

  2. Lens AI Workers: Aggregate, transform, and extract metric data from sensor inputs for analysis.

  3. Lens AI GraphQL Server: Provides a GraphQL endpoint to fetch the aggregated metrics, enabling flexible and efficient data retrieval (see the example query sketched after this list).

  4. Lens AI Monitoring Server: Uses Grafana to build interactive dashboards for monitoring key metrics.

  5. Lens AI Database: Utilizes MongoDB to store sensor data and metrics for persistent and scalable storage.
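As a rough sketch of how the GraphQL server can be queried once the stack is running, the request below posts a standard GraphQL query with curl. The port, endpoint path, and field names are placeholders for illustration only, not the actual schema; check docker-compose.yml and the GraphQL server's schema for the real values.

# Hypothetical query; replace <graphql-port> and the query fields with the real port and schema
curl -s http://localhost:<graphql-port>/graphql \
  -H "Content-Type: application/json" \
  -d '{"query": "{ metrics(projectId: \"your_project_id\") { name value } }"}'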

Lens AI Usage

git clone https://github.com/lens-ai/lensai_server
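Then change into the cloned directory (assuming the default directory name created by git clone):

cd lensai_server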

Server Configuration

nano config.ini 

Build the Docker containers

docker-compose up --build -d 
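Once the build finishes, you can confirm the containers are running and follow their logs:

docker-compose ps
docker-compose logs -f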

Lens AI Server Configuration

Adjust the config file to match your project requirements; the main settings are listed below, with an example sketched after the list. Please don't change the DB collection names.

  • [DEFAULT]

  • PROJECT_ID = your project id

  • SLEEP_INTERVAL = 10 (Sleep interval of the workers)

  • NUM_WORKERS = 1 (Number of worker threads)

  • [paths]

  • BASE_PATH = /tmp (The base path under which the data is mounted in the container)
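For illustration, the edited sections of config.ini might look like the sketch below. The PROJECT_ID value is a placeholder; leave any other keys in the file, including the DB collection names, unchanged.

[DEFAULT]
PROJECT_ID = my_project_id
SLEEP_INTERVAL = 10
NUM_WORKERS = 1

[paths]
BASE_PATH = /tmp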

Lens AI Sensor Data Handler

The sensor data handler runs on port 8000 and can be accessed at http://localhost:8000 on the host machine. To change the host port, modify the port mapping in the docker-compose.yml file.
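A quick way to confirm the handler is reachable from the host is a plain HTTP request. This is only a connectivity check; the handler's actual API routes are not documented here and may differ.

curl -i http://localhost:8000/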

Lens AI Dashboard

The Lens AI Dashboard is accessible on port 3000 on the host machine. Access it via http://localhost:3000.
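To verify Grafana is up before opening it in a browser, a plain HTTP request to the dashboard port should return a redirect or the login page. Grafana's default credentials are admin/admin unless they are overridden in the docker-compose.yml file.

curl -I http://localhost:3000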

Screenshots of the Dashboard
