Events
Introduction
The major feature that makes Camlytics Service stand out from the crowd of video surveillance systems is its ability to analyze video camera streams in real time, generate different kinds of events, and send them to the cloud database for storage.
An event is the basic entity that allows Camlytics to build its real-time occupancy reports and charts, provide API access, and more.
All events are generated when tracked objects interact with triggers: lines, zones, or the scene itself. Lines and zones are configured during Camlytics Service calibration, and you can add multiple lines or zones to the same camera scene.
The default retention period (storage time) for events in the Cloud account is 3 months. If you want to store events for a longer period, you need to purchase additional storage units for each of your channels.
Detection profile
The first step in setting up Camlytics Service is choosing the appropriate detection profile. The profile defines what types of objects will be detected and how the counting is performed. Your choice directly impacts the accuracy and relevance of analytics results.

Recommended profiles
- AI vehicles & people – Use this when you need to detect both people and vehicles.
- AI people (overhead) – Best suited for overhead camera angles (top-down view) to count people.
- AI people (tilted) – Designed for cameras mounted at an angle, ideal for people counting.
- AI faces – Enables detection of gender and age through facial analysis.
Non-AI profiles
Overhead camera / Tilted camera (including high and low sensitivity options) – These are lightweight profiles that do not use neural networks. They are suitable for basic motion-based object counting, but they do not classify objects. For example, a person and a car will be treated the same. These profiles are less resource-intensive and ideal when you only need general object flow data, not detailed classification or advanced analytics.
If needed, you can also disable analytics completely for a channel by checking the “Channel events disabled” option. This can be useful if the camera is only used for streaming or recording, without analytics.
Additionally, you can adjust the detection threshold (human confidence level). Higher values reduce false positives but may result in missed detections.
We recommend testing several profiles with your camera setup to find the best balance of accuracy and performance.
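As a rough illustration of how the detection threshold behaves, the sketch below filters hypothetical detections by confidence. The detection values and labels are made up for illustration; Camlytics applies this kind of filtering internally.

```python
# Hypothetical detections with confidence scores; not real Camlytics output.
detections = [
    {"label": "human", "confidence": 0.92},
    {"label": "human", "confidence": 0.55},  # borderline: shadow or person?
    {"label": "human", "confidence": 0.31},  # likely noise
]

def filter_detections(detections, threshold):
    """Keep only detections at or above the confidence threshold."""
    return [d for d in detections if d["confidence"] >= threshold]

print(len(filter_detections(detections, 0.5)))  # lower threshold: more hits, more noise -> 2
print(len(filter_detections(detections, 0.9)))  # higher threshold: fewer false positives -> 1
```

A lower threshold keeps the borderline detection (possible false positive); a higher one drops it (possible missed detection), which is exactly the trade-off described above.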
Calibration (only for non-AI profiles)
Calibration is required only when using non-AI detection profiles. These profiles rely on traditional motion-based tracking, so accurate calibration is essential for reliable object detection, counting, and heatmaps.
To access calibration settings, go to Calibration in the channel menu.

In the Calibration section, you’ll see a ruler overlaid on a video snapshot. This ruler must be adjusted to match the real-world size of a typical object in your scene. Choose an average-sized object you want to track (usually a person), and scale the ruler to fit it accurately.
- For Overhead camera profiles, the ruler should match a person seen from above.
- If object sizes vary significantly in the scene, it's better to calibrate based on the smaller object.
- Proper calibration ensures that the green tracking boxes closely match the size of actual objects.
Examples of good calibration
- Green tracking boxes align with real object dimensions.
- Tracking is smooth and consistent.

Examples of bad calibration
- Ruler set too large or too small.
- Tracking boxes are oversized or tiny, leading to poor results and unstable analytics.


For Tilted camera profiles, it’s also critical to set the marker correctly, as it affects the system’s understanding of minimum and maximum object sizes for detection.

Once calibration is complete, you can proceed to define the Area of Interest, detection lines, and zones.
Area of Interest
The Area of Interest defines the region of the video where analytics will be active and objects will be detected. By default, it covers the entire camera view, but in most cases, it should be narrowed down.

- For AI profiles (e.g. AI people, AI vehicles & people, AI face), AOI helps reduce visual noise by limiting detection to relevant parts of the scene, improving accuracy and lowering processing load.
- For non-AI profiles (e.g. Overhead camera, Tilted camera), AOI is critical — it allows you to exclude unwanted motion, such as automatic doors, elevators, moving trees, or reflections. Without a properly configured AOI, detection and counting can be severely affected by false triggers.
You can adjust the area freely by double-clicking the AOI boundary to add or remove nodes.
Triggers (Zones, Lines, and SpeedLines)
Triggers are core elements that define what types of movement and activity the system should detect and respond to. They let you track object flow, count events, measure speed, and generate automated responses.
There are three types of triggers:
Lines

Lines are used to count objects crossing a virtual line in the video. You can configure:
- One-way or two-way detection
- Entry/exit logic
- Object type filters (e.g., only people or vehicles in AI profiles)
Lines are ideal for entrances, gates, hallways, or any directional flow monitoring.
You can also enable Tailgating detection by checking the Tailgating option for a line. This event triggers when two objects cross the same line within a short interval (typically under 1 second).
Tailgating detection is especially useful for access control scenarios, such as monitoring office entry points to detect when someone follows another person through a secured gate or door without proper authorization. It works best with overhead cameras and people-counting profiles.
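The tailgating rule can be sketched in a few lines: flag any two consecutive crossings of the same line that happen within the allowed interval. The timestamps below are made up, and Camlytics generates the Tailgating event itself; this only mimics the rule for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical "Line crossed" timestamps for a single line.
crossings = [
    datetime(2024, 1, 1, 9, 0, 0),
    datetime(2024, 1, 1, 9, 0, 0, 600_000),  # 0.6 s later -> tailgating
    datetime(2024, 1, 1, 9, 0, 15),          # 14.4 s later -> normal
]

def tailgating_pairs(crossings, max_gap=timedelta(seconds=1)):
    """Return pairs of consecutive crossings closer together than max_gap."""
    ordered = sorted(crossings)
    return [(a, b) for a, b in zip(ordered, ordered[1:]) if b - a < max_gap]

print(len(tailgating_pairs(crossings)))  # -> 1
```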
Zones
Zones detect objects entering, moving, or dwelling in specific areas of the video frame.

Each zone has two built-in events — Zone joined and Motion started — which are always active and do not require configuration.
In addition, zones support two optional, configurable event types:
- Object dwell – triggers when an object stays inside the zone longer than a specified dwell time (in seconds)
- Crowd appear – triggers when a defined minimum number of objects are present in the zone for a set duration
These advanced triggers are useful for detecting loitering, queues, overcrowding, or unusual idle behavior within critical areas.
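Both configurable zone rules can be sketched as simple checks over the objects currently inside the zone. The object IDs, entry times, and thresholds below are assumptions for illustration, and the crowd check here omits the duration requirement for brevity; Camlytics evaluates the full rules internally using your configured settings.

```python
# object id -> entry time into the zone, in seconds (hypothetical data)
zone_entries = {"obj-1": 100.0, "obj-2": 130.0}

def dwelling_objects(zone_entries, now, dwell_seconds):
    """Object dwell: IDs of objects in the zone longer than dwell_seconds."""
    return [oid for oid, t in zone_entries.items() if now - t > dwell_seconds]

def crowd_present(zone_entries, min_objects):
    """Crowd appear (duration check omitted): enough objects in the zone at once."""
    return len(zone_entries) >= min_objects

print(dwelling_objects(zone_entries, now=145.0, dwell_seconds=30))  # -> ['obj-1']
print(crowd_present(zone_entries, min_objects=3))                   # -> False
```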
SpeedLines
SpeedLines allow for both speed measurement and trajectory-based counting.

- Measure how fast objects move between two lines
- Set thresholds to detect overly fast or slow movement
- Ideal for traffic monitoring, speeding alerts, or identifying abnormal pedestrian behavior
In addition to speed analytics, SpeedLines can also be used to count objects moving along specific paths.
For example:
- At an intersection, you can count vehicles turning right separately from those going straight across a perpendicular road
- Useful in traffic analysis, flow segmentation, or behavior mapping in complex environments
SpeedLines are powerful when you need to understand how fast and in which direction people or vehicles are moving.
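The speed measurement itself reduces to known distance over elapsed time between the two line crossings. The distance and timestamps below are assumptions for illustration; in practice they come from your scene setup and the tracked object's crossing events.

```python
def speed_kmh(distance_m, t_first_cross, t_second_cross):
    """Average speed between two SpeedLines, converted from m/s to km/h."""
    elapsed = t_second_cross - t_first_cross  # seconds between crossings
    return distance_m / elapsed * 3.6

# A vehicle covering 20 m between the lines in 1.6 s:
v = speed_kmh(distance_m=20.0, t_first_cross=0.0, t_second_cross=1.6)
print(round(v, 1))  # -> 45.0
```

Comparing the result against a configured threshold is then enough to flag overly fast or slow movement.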
All triggers can be layered and combined within a single scene. You can name them, export event data, connect them to APIs or webhooks, and use them for real-time alerts or historical analysis.
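A webhook integration boils down to parsing the event payload and reacting to its type. The field names below ("type", "trigger", "objectId") are assumptions about the payload shape, not the documented Camlytics schema; check the actual webhook format before relying on them.

```python
import json

def handle_event(raw_body: bytes) -> str:
    """Sketch of a webhook handler: parse the event and route by type."""
    event = json.loads(raw_body)
    if event.get("type") == "Tailgating":
        return f"ALERT: tailgating at {event.get('trigger')}"
    return f"logged: {event.get('type')}"

# Hypothetical payload a trigger might deliver:
body = b'{"type": "Tailgating", "trigger": "Main entrance", "objectId": 42}'
print(handle_event(body))  # -> ALERT: tailgating at Main entrance
```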
Event types
There are multiple event types that power the various reports. Every event is generated by a unique object that has an ID and classification details (Human/Vehicle/etc.).
You can find the event types table below.
| Name | Trigger | Description | Has object ID | Object classification |
|---|---|---|---|---|
| Line crossed | Line | Fired when a line is crossed by an object of any kind. | Yes | Yes |
| Tailgating | Line | Fired when two objects cross the same line with a small delay (up to 1 sec). Useful for access security monitoring. Mostly used with people counting. Read more in our use cases. | Yes | Yes |
| Zone joined | Zone | Fired when an object has joined the zone. | Yes | Yes |
| Motion started | Zone | Indicates motion start in a zone. Triggered by an object entering the zone. If the object has just appeared, the Zone joined and Motion started events fire simultaneously. | Yes | Yes |
| Object dwell | Zone | Fired when an object has stayed in a zone longer than the configured dwell time (set during calibration). | Yes | Yes |
| Crowd appear | Zone | Fired when at least the configured number of objects have been present in a zone for the configured duration (set during calibration). | No | No |
| Camera obstructed | Scene | Indicates that the camera has been obstructed partly or completely by light, a large object, etc., or that the camera has been shifted. Analogous to the "Sabotage" event. | No | No |
Events page
The Events page lets you browse all events stored in your Cloud account. You can filter by location, channel, trigger name, time, type, and class (Vehicle/Human/etc.). Once the filtered events are shown, you can export them to a .csv spreadsheet.
You can also get a snapshot of each event. All snapshots are stored on the local machine running Camlytics Service and are pulled from there when you click the "Get snapshot" button.
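An exported .csv is easy to post-process in a few lines. The column names below ("Time", "Type", "Class", "Trigger") are assumptions for illustration; check the header row of your actual export before adapting this.

```python
import csv
import io

# Stand-in for the contents of an exported events .csv (hypothetical rows).
sample = """Time,Type,Class,Trigger
2024-01-01 09:00:00,Line crossed,Human,Entrance
2024-01-01 09:00:05,Line crossed,Vehicle,Gate
2024-01-01 09:01:00,Line crossed,Human,Entrance
"""

def count_by_class(csv_text, event_type="Line crossed"):
    """Count exported events of one type, grouped by object class."""
    counts = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["Type"] == event_type:
            counts[row["Class"]] = counts.get(row["Class"], 0) + 1
    return counts

print(count_by_class(sample))  # -> {'Human': 2, 'Vehicle': 1}
```

The same pattern extends to occupancy tallies, per-trigger totals, or time-bucketed counts.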