Features

Key Features of Lens AI

Memory Efficiency

Fixed Memory Usage

The data structures used by Lens AI operate with a fixed amount of memory, making them ideal for applications with memory constraints or long-running processes where memory usage must remain predictable and bounded.
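As an illustration of this bounded-memory behavior (using the open-source Apache DataSketches KLL implementation, not Lens AI's own profiler API), a sketch's footprint stays small no matter how many values are streamed into it:

```python
# Illustrative only: Apache DataSketches KLL sketch, not the Lens AI profiler API.
from datasketches import kll_floats_sketch
import random

sketch = kll_floats_sketch(200)  # k = 200 bounds the number of retained items

for _ in range(1_000_000):       # stream a million values
    sketch.update(random.gauss(0.0, 1.0))

# The summary stays small regardless of how many values were streamed.
print("values seen:    ", sketch.get_n())
print("items retained: ", sketch.get_num_retained())
print("approx. median: ", sketch.get_quantile(0.5))
```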

Scalability

Unlike classical histograms that might require more memory as more data is processed or as data complexity increases (e.g., higher resolution, multiple channels), these sketches maintain a consistent memory footprint, ensuring efficient scalability.

The space complexity is O((1/ε) · log(εN)), whereas for classical logging it is O(N).
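A rough back-of-the-envelope comparison, ignoring constant factors and treating both bounds as items stored, with a hypothetical error target ε = 0.01 and N = 10⁹ streamed values:

```python
import math

N = 1_000_000_000    # one billion streamed values
eps = 0.01           # 1% rank-error target (hypothetical)

sketch_items = (1 / eps) * math.log(eps * N)   # O((1/eps) * log(eps * N))
classical_items = N                            # O(N): every value is retained

print(f"sketch (order of): {sketch_items:,.0f} items")    # roughly 1,600 items
print(f"classical logging: {classical_items:,} items")    # 1,000,000,000 items
```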

Computational Efficiency

Lens AI leverages KLL data structures that are designed for incremental updates, making them highly efficient for streaming data scenarios.

Amortized logarithmic insertion complexity ensures efficient handling of high-throughput data streams.

Both insertion and query times are better than with classical logging.
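As a sketch of what incremental, mergeable updates look like in practice (again with the open-source Apache DataSketches KLL implementation, not the Lens AI profiler itself; the two-device scenario is hypothetical):

```python
from datasketches import kll_floats_sketch
import random

# Two independent streams, e.g. metrics from two edge devices (hypothetical scenario).
device_a = kll_floats_sketch(200)
device_b = kll_floats_sketch(200)

for _ in range(500_000):
    device_a.update(random.uniform(0.0, 255.0))   # each update is a cheap incremental operation
    device_b.update(random.uniform(0.0, 255.0))

# Sketches can be merged without replaying the raw data.
combined = kll_floats_sketch(200)
combined.merge(device_a)
combined.merge(device_b)

print("p50:", combined.get_quantile(0.50))
print("p95:", combined.get_quantile(0.95))
```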

Realtime Sampling

A wide range of built-in techniques for sampling data where the model is most uncertain (see the sketch below).

Reduce data transfer costs significantly.

Keep your model up to date with the latest data.
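A minimal sketch of one such technique, entropy-based selection on softmax outputs; the function names and the 0.9 threshold are illustrative, not Lens AI's built-in samplers:

```python
import math

def entropy(probs):
    """Shannon entropy of a softmax output; higher means the model is less certain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def should_sample(probs, threshold=0.9):
    """Keep an input for upload/labeling only when prediction entropy exceeds the threshold."""
    return entropy(probs) > threshold

# Confident prediction: low entropy, not sampled.
print(should_sample([0.97, 0.02, 0.01]))   # False
# Uncertain prediction: high entropy, sampled for transfer.
print(should_sample([0.40, 0.35, 0.25]))   # True
```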
