Settings on camera

Although Eocortex can work with one or several streams from cameras of any resolution, keep in mind that the load on the Eocortex server largely depends on the parameters of the streams received from the cameras. It is also important to consider the specifics of configuring and running the Eocortex applications.

For most cameras, the parameters of the broadcast video and audio streams should be set on the camera itself, using the camera's web interface. An exception is made for certain brands and models of cameras, for which the parameters of video streams can be set in the Eocortex Configurator application.

The terms that may be encountered in camera settings are explained below.

Stream

Note

Possible names of the option: Stream, Channel

Stream means the video data transmission channel from the camera.

Eocortex supports receiving up to four streams from one source at the same time; in the camera settings, these streams can be identified as:

  • First, Main

  • Second, Sub, Additional, Additional 1, Alternative, Alternative 1

  • Third, Additional 2, Alternative 2

  • Fourth, Additional 3, Alternative 3

For most devices, the Stream option is used to enable and select a channel to configure other broadcast settings.

Codec

Note

Possible names of the option: Codec, Format, Video format, Compression format, Coding format, Encoding mode

Codec means a compression and decoding algorithm used to reduce the amount of data to be transmitted and stored.

Eocortex supports the following codecs:

  • MJPEG (Motion JPEG) — the standard with the lowest compression rate among the supported ones. The stream is composed of a series of JPEG images containing the full frames of the scene. Compared to other standards, it has the lowest demands on the resources of the decoding device, while at the same time requiring the largest channel width and disk space for storing the archive.

  • MPEG-4 (MPEG-4 Part 2) — the standard that provides a moderate compression rate: for the same image quality, the channel width and archive size will be much smaller than with MJPEG, while the resource consumption during decoding increases insignificantly. The stream is composed of reference frames (I-frames) containing the full image of the scene, separated by sequences of intermediate frames (P- and B-frames) that contain only the moving part of the frame and object motion compensation data for predicting future frames. The prediction method causes moving objects to appear visually blurred when viewed in slow motion and on pause. Modern cameras rarely use MPEG-4, relying instead on the more advanced H.264.

  • H.264 (MPEG-4 Part 10, AVC) — the standard that can be considered a successor to MPEG-4. Thanks to improved stream formation and frame prediction algorithms, it provides a higher compression ratio and better quality of intermediate frames than its predecessor. This reduces the amount of transmitted and stored data for the same image quality at the cost of increased resource consumption during processing compared to MPEG-4.

  • H.265 (HEVC) — the standard that can be considered a successor to H.264. Thanks to more efficient encoding algorithms, it provides an even greater reduction in the amount of transmitted and stored data at the cost of increased resource consumption during data processing compared to H.264.

  • MxPEG — a proprietary standard developed and used by Mobotix. It is essentially a combination of the MJPEG and H.264 standards: each frame of the stream is independent, but only some of the frames contain the full image of the scene, while most frames contain only the changed fragments. This approach reduces motion blur on fragmented frames, while the resource requirements for transmission, storage, and stream processing fall between those of MJPEG and H.264.

  • H.264+, H.265+, WiseStream, Zipstream, Smart Stream — enhanced versions of H.264 and H.265 with improvements to various algorithms of the original codecs. They are most often presented in the camera interface as an option of the original codec rather than as a standalone codec. They make it possible to reduce the bitrate without a noticeable loss of quality, lowering the load on the network and decreasing the size of the archive. The higher compression rate compared to the original codecs also implies higher resource requirements for decoding.

Note

Requirements and limitations when using the enhanced codec versions:

Note

These limitations apply when the GOV value is greater than 350.

  • Periodic lags may occur when viewing the archive.

  • When there are many cameras with a high GOV value, a large amount of video data is kept in RAM, which can lead to high CPU resource consumption.

  • Lags may occur when positioning in the archive.

  • When viewing live video, the Waiting message before playback starts may be displayed longer than usual. This happens when the Connect via server option is enabled in the camera settings. If this option is disabled, the Eocortex Client application connects directly to the camera and the video is displayed almost without delay.

  • The higher the GOV value and the lower the frame rate, the more the archive is decimated. In some cases, applying the second decimation stage becomes pointless.

  • When playing back a camera archive recorded with a high GOV value, it is necessary to have the Use advanced codec version option enabled in the additional stream settings even if the camera is currently set to a low GOV.

Note

The higher the compression of the video, the smaller the amount of data to be transmitted and stored, but the greater the resource consumption for video decoding during playback and processing by video analytics modules.

Compression

Note

Possible names of the option: Compression, Compression level, Quality, Quality level

All supported video formats provide the ability to adjust the balance between the quality of the frame and its bitrate. This setting can be represented in the camera settings as one of the following options:

  • Compression rate — the parameter that determines the degree of detail reduction of the frame.

  • Quality level — the opposite parameter to the compression rate, determining the remaining details of the frame.

The parameter values can be specified as a percentage of the source image (e.g., Quality=90%) or as levels predefined by the camera manufacturer (e.g., in the range from 0 to 12).

Both parameters are responsible for reducing the detail of the frame; the difference is in how the selected value is applied: while the Compression rate indicates how much the detail is to be reduced, the Quality level indicates how much of it is to be preserved. For example, setting Compression=30% is the same as setting Quality=70%.
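This equivalence is simple arithmetic. The Python sketch below is purely illustrative (the function names are made up, not part of Eocortex) and applies only when both parameters are expressed as percentages; manufacturer-defined levels (for example, 0 to 12) follow the camera's own scale.

```python
def quality_to_compression(quality_percent: float) -> float:
    """Convert a Quality level (percent of detail preserved)
    to the equivalent Compression rate (percent of detail removed)."""
    return 100.0 - quality_percent

def compression_to_quality(compression_percent: float) -> float:
    """Inverse conversion: Compression rate -> Quality level."""
    return 100.0 - compression_percent

# The example from the text: Compression=30% corresponds to Quality=70%.
assert quality_to_compression(70) == 30
assert compression_to_quality(30) == 70
```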

Depending on the video surveillance conditions and camera hardware features, an acceptable compression level is usually from 30 to 60%, although values from 0% (minimum compression) to 70% (high compression) are possible. The actual compression rate should be chosen based on a visual evaluation of the resulting image quality.

Resolution

Note

Possible names of the option: Resolution, Image size, Image quality

Resolution — the parameter that determines the detail of the frame by specifying the number of color dots (pixels) that form the image. Depending on the camera model, this setting can be expressed as the number of dots horizontally and vertically (1920x1080), as the total number of dots in the image (2 Mpix), or as a standard name (FullHD).
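The three notations describe the same property. For illustration only, a small Python helper (hypothetical, not part of Eocortex) shows how the width-by-height notation relates to the megapixel notation:

```python
def megapixels(width: int, height: int) -> float:
    """Total number of pixels in the frame, expressed in megapixels."""
    return width * height / 1_000_000

# FullHD is 1920x1080, i.e. roughly 2 Mpix.
print(megapixels(1920, 1080))   # 2.0736 -> usually rounded to "2 Mpix"
# 4K UHD is 3840x2160, i.e. roughly 8 Mpix.
print(megapixels(3840, 2160))   # 8.2944 -> usually rounded to "8 Mpix"
```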

The higher the resolution, the greater the detail in the frame and the higher the resource consumption in transmitting, storing and processing it.

Note

Eocortex has no limitations on this parameter except those imposed by the standard itself. However, a low resolution can make it impossible for video analytics modules to analyze the frame, while a high resolution may result in increased resource consumption when decoding an overly detailed frame.

Frame rate

Note

Possible names of the option: Frame Rate, Frame frequency, Frames per second, FPS

Frame Rate — the parameter indicating the number of frames per one second of video.

The higher the value of this parameter, the smoother the motion in the video, but the higher the resource consumption in transmission, storage, and processing.

Note

Eocortex has no limitations on this parameter except those imposed by the standard itself. However, a low frame rate can cause lags when playing video and make it impossible for video analytics modules to analyze the stream.

Profile

Note

This option is typically available only for the H.264 codec

Note

Possible names of the option: Profile

Profile — a set of settings and restrictions applied to the codec algorithms that ensures compatibility of stream compression and decoding across different devices. The selected profile also determines the initial video compression rate, which in turn affects resource consumption during transmission and decoding.

Eocortex supports the following profiles:

  • Baseline (BP) — profile that assumes minimum video compression. Compared to other profiles, it provides the lowest resource consumption during decoding at the expense of increasing the amount of data for transmission and storage.

  • Main (MP) — profile that assumes a balance between resource consumption in decoding, transmitting and storing data.

  • High (HP) — profile that assumes maximum video compression. Compared to other profiles, it provides the lowest load on the network and storage disks at the expense of increased resource consumption during decoding.

Bitrate

Note

This option is typically available only for the H.264 codec

Note

Possible names of the option: Bitrate

Bitrate — the parameter that defines the amount of information transmitted by the camera per unit of time. It is measured in bits per second (bit/s, bps), as well as in derived units with kilo- (kbit/s, kbps), mega- (Mbit/s, Mbps), and similar prefixes.
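Since the bitrate directly determines the network and storage load, it is convenient to estimate the archive footprint of a single continuously recorded stream. The Python sketch below is plain arithmetic for illustration only (not an Eocortex calculation; the function name is made up):

```python
def gigabytes_per_day(bitrate_mbps: float) -> float:
    """Approximate archive size of one continuously recorded stream per day.

    bitrate_mbps -- average stream bitrate in Mbit/s.
    Returns the size in gigabytes (1 GB = 10**9 bytes).
    """
    bits_per_day = bitrate_mbps * 1_000_000 * 86_400  # 86,400 seconds in a day
    return bits_per_day / 8 / 1_000_000_000           # bits -> bytes -> GB

# A 4 Mbit/s stream recorded around the clock:
print(round(gigabytes_per_day(4), 1))  # ~43.2 GB per day
```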

Note

Eocortex has no limitations on this parameter except those imposed by the standard itself. However, a low bitrate value can cause a significant degradation of image quality, especially in high-activity scenes.

Bitrate type

Note

This option is typically available only for the H.264 codec

Note

Possible names of the option: Bitrate type

Bitrate Type — the parameter that determines how to control the amount of data transmitted.

  • Variable Bit Rate (VBR) — a bitrate type with a dynamic amount of transmitted data. It maintains the specified parameters of the video stream, while the used channel width can vary. This mode is recommended in most cases, provided there are no network bandwidth problems.

  • Constant Bit Rate (CBR) — a bitrate type with a static amount of transmitted data. It maintains the specified channel width; at the same time, depending on the implementation in a given camera model, some parameters of the video stream may be changed and, as a result, the image quality may be reduced. This mode is recommended only when there are network bandwidth problems.

GOV

Note

This option is typically available only for the H.264 codec

Note

Possible names of the option: GOP, GOV, Group of Pictures, Group of Video, Group of VOP, I-frame

GOV — the parameter that defines the length of the group of frames (the distance between the reference frames).

For example, with a GOV of 50, there will be one reference frame for every 50 frames transmitted (i.e., there will be 49 intermediate frames between two neighboring reference frames); with a frame rate of 25 fps and a GOV of 50, there will be one reference frame every 2 seconds. The higher the GOV, the lower the video stream bitrate, but the greater the memory and CPU consumption: to decode each subsequent frame, the reference frame and all intermediate frames following it must be kept in memory until the next reference frame arrives.
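The same arithmetic can be expressed as a short Python sketch (purely illustrative; the function name is made up):

```python
def gov_layout(fps: float, gov: int) -> tuple[float, int]:
    """For a given frame rate and GOV length, return the interval between
    reference frames (in seconds) and the number of intermediate frames
    between two neighboring reference frames."""
    seconds_between_i_frames = gov / fps
    intermediate_frames = gov - 1
    return seconds_between_i_frames, intermediate_frames

# The example from the text: 25 fps and GOV = 50.
print(gov_layout(25, 50))  # (2.0, 49) -> one reference frame every 2 seconds
```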

Note

Eocortex has no limitations on this parameter except those imposed by the standard itself. However, a high GOV value may cause video playback lags and prevent analytics modules from analyzing the stream, while a low GOV value will increase the load on the network and the archive. For example, with a GOV of 1, an H.264 stream is essentially no different from MJPEG.