# Developers Guide

## UserAgent <a href="#user-agent" id="user-agent"></a>

The [`UserAgent`](https://apirtc.github.io/references/apirtc-js/UserAgent.html) is an entry point to ApiRTC CPaaS. This is the first object to instantiate when implementing a front-end application. It represents the local user that will participate in the conversation.

The UserAgent can be either anonymous or identified.

Identification is done through a JWT retrieved from an authentication service.

* Read the [authentication](https://dev.apirtc.com/apirtc-developer-portal/apirtc-js-library/authentication "mention") page for more details on how to authenticate.
* Read the [`UserAgent` reference page](https://apirtc.github.io/references/apirtc-js/UserAgent.html).

### MediaDevices

`UserAgent`'s `mediaDeviceChanged` event can be listened to in order to get notified whenever the list of media devices available to the browser changes:

```javascript
userAgent.on("mediaDeviceChanged", () => {
  const mediaDevices = userAgent.getUserMediaDevices();
  // handle the new set of mediaDevices
});
```

This is useful to propose a list of available media devices to the user.
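As a sketch of such a picker, assuming `getUserMediaDevices()` returns an object keyed by device kind (`audioinput`, `videoinput`, `audiooutput`), each holding devices by id (the exact shape and the helper name below are illustrative assumptions):

```javascript
// Sketch: turn a mediaDevices map into option data for a device picker.
// Assumed shape: { audioinput: { <id>: device }, videoinput: {...}, audiooutput: {...} }
function listDeviceOptions(mediaDevices, kind) {
  const devices = mediaDevices[kind] || {};
  return Object.keys(devices).map((id) => ({
    value: id,
    label: devices[id].label || id, // fall back to the id when no label is exposed
  }));
}

// Example with a hypothetical device map:
const options = listDeviceOptions(
  { videoinput: { cam1: { label: 'Front camera' }, cam2: {} } },
  'videoinput'
);
// options: [{ value: 'cam1', label: 'Front camera' }, { value: 'cam2', label: 'cam2' }]
```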

## Session

A [`Session`](https://apirtc.github.io/references/apirtc-js/Session.html) instance represents a connection to the ApiRTC CPaaS. A `Session` is configured by an API key and a declared `UserAgent`.

A session handles all the interactions of participants, including video/audio streams and data exchanges for one `Enterprise` identified by its API key.

A `Session` object is obtained through the [`UserAgent.register(options)`](https://apirtc.github.io/references/apirtc-js/UserAgent.html#register) method. Some `options` (of type [`RegisterInformation`](https://apirtc.github.io/references/apirtc-js/global.html#RegisterInformation)) control the authentication mechanisms.

* Read the [authentication](https://dev.apirtc.com/apirtc-developer-portal/apirtc-js-library/authentication "mention") page for more details on how to authenticate.
* See the [Session reference page](https://apirtc.github.io/references/apirtc-js/Session.html).

### Contacts

Contacts are the participants to a `Session`. They can be authenticated or anonymous.

```javascript
Object.entries(session.getContacts()).forEach(([username, contactObject]) => {
  console.log(contactObject.getId());
});
```

* See the [`Contact` reference page](https://apirtc.github.io/references/apirtc-js/Contact.html)

### Data exchange

Data can be exchanged between Contacts by using the [`Contact.sendData(object)`](https://apirtc.github.io/references/apirtc-js/Contact.html#sendData__anchor) method:

```javascript
contact.sendData({aProperty:'aValue'})
  .then(() => {
    console.log("message sent")
  }).catch((error: any) => {
    console.error("sendData", error)
  })
```

To receive the data, listen on the Session's [`contactData`](https://apirtc.github.io/references/apirtc-js/Session.html#event:contactData) event:

```javascript
session.on('contactData', contactDataEvent => {
  console.log('received data from sender', contactDataEvent.sender, contactDataEvent.content)
})
```

### Presence group

Each `Contact` entering a `Session` can join presence groups, which segment the connected users into subcategories.

For example: an employee can get into a Session, and join the "Operator" and "Available" groups, while a customer will join the "Customer" group.

To make a user connect within a group, set `RegisterInformation.groups` in the `UserAgent.register(options)` options, or use the `Session.joinGroup` method.

Joining a group as a participant activates the session's `contactListUpdate` event for this group. Alternatively, you can subscribe to a group's events without joining it with the `Session.subscribeToGroup` method.

{% hint style="info" %}
If the current participant doesn't subscribe to or join a group, they will not receive events regarding that group's changes.
{% endhint %}

The data object associated to the `Session.contactListUpdate` event has `joinedGroup` and `leftGroup` properties to carry information on which `Contact` joined or left which group:

```javascript
session.on('contactListUpdate', (updatedContacts: any) => {
  for (const group of Object.keys(updatedContacts.joinedGroup)) {
    for (const contact of updatedContacts.joinedGroup[group]) {
      // ...
    }
  }
  for (const group of Object.keys(updatedContacts.leftGroup)) {
    for (const contact of updatedContacts.leftGroup[group]) {
      // ...
    }
  }
})
```
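For example, an application can maintain its own view of group membership from this event. The reducer below is an illustrative sketch (contact objects are assumed to expose `getId()` as in the snippet above):

```javascript
// Sketch: keep a local registry of which contact ids belong to which group,
// fed by the joinedGroup/leftGroup maps of the contactListUpdate event.
const groupMembers = {}; // group name -> Set of contact ids

function applyContactListUpdate(updatedContacts) {
  for (const group of Object.keys(updatedContacts.joinedGroup || {})) {
    groupMembers[group] = groupMembers[group] || new Set();
    for (const contact of updatedContacts.joinedGroup[group]) {
      groupMembers[group].add(contact.getId());
    }
  }
  for (const group of Object.keys(updatedContacts.leftGroup || {})) {
    if (!groupMembers[group]) continue;
    for (const contact of updatedContacts.leftGroup[group]) {
      groupMembers[group].delete(contact.getId());
    }
  }
}
```

It can then be wired directly to the event: `session.on('contactListUpdate', applyContactListUpdate);`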

### UserData

`UserData` is a class that holds a data object to store values associated to a `UserAgent`. Make sure to call `UserData.setProp(key, value)` to set up a property.

Once connected to a `Session`, call `userData.setToSession()` to notify other connected peers of `UserData` property changes through the `Session.contactListUpdate` event.

For that purpose, the data object associated to `Session.contactListUpdate` event has a `userDataChanged` property which is an array of `Contact`s for which `UserData` has changed.

```javascript
session.on('contactListUpdate', updatedContacts => {
  for (const contact of updatedContacts.userDataChanged) {
    // ...
  }
});
```

## Conversation <a href="#conversation" id="conversation"></a>

A conversation is the way to gather participants to exchange media: text messages, audio/video streams, files...

A conversation takes place whenever there are 2 or more participants.

### getOrCreateConversation

To get a [`Conversation`](https://apirtc.github.io/references/apirtc-js/Conversation.html) instance, the Session's method [`getOrCreateConversation(name, options)`](https://apirtc.github.io/references/apirtc-js/Session.html#getOrCreateConversation__anchor) should be used.

The `name` is a string without any constraints.

```javascript
conversation = session.getOrCreateConversation(name, {
  meshModeEnabled: false,
  meshOnlyEnabled: false,
  moderationEnabled: false,
  moderator: false
});
```

#### Options

| key                 | description                                                                                                                                                        |
| ------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `meshModeEnabled`   | enables the mesh mode (otherwise SFU star topology is used by default). (default: `false`)                                                                         |
| `meshOnlyEnabled`   | forces the mesh mode for the whole Conversation.                                                                                                                   |
| `moderationEnabled` | enables moderation on the `Conversation`. This option may change the behavior of the joining process depending on the `moderator` option value. (default: `false`) |
| `moderator`         | adds the local user to the list of moderators for the conversation. (default: `false`)                                                                             |

{% hint style="info" %}
Every participant must enable the `moderationEnabled` option for moderation to be applied consistently throughout the Conversation.
{% endhint %}

#### Precision on the Mesh Mode

The mesh mode enables a peer-to-peer connection across participants, without going through a stream routing server (called SFU for Selective Forwarding Unit).

{% hint style="info" %}
Mesh mode multiplies the streams sent by each participant. As upload bandwidth is often lower than download bandwidth, the network connection can become shaky as the number of participants grows.
{% endhint %}

If `meshModeEnabled` is `true` when setting the conversation mode, the stream exchanges will remain in P2P until:

* the number of participants goes over 4,
* or too much packet loss is detected for one participant.

Then the conversation will automatically switch to star topology mode using the ApiRTC SFUs infrastructure.

Setting both `meshModeEnabled` and `meshOnlyEnabled` to `true` forces the conversation to remain mesh only, whatever the connection's events.
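These switching rules can be sketched as a pure decision function. This is illustrative only, not ApiRTC internals; the 5% packet-loss threshold is an arbitrary placeholder:

```javascript
// Sketch of the topology-switching rules described above (not ApiRTC internals):
// stay in mesh while 4 or fewer participants and low packet loss,
// unless meshOnlyEnabled pins the conversation to mesh.
function selectTopology({ participants, packetLossRatio, meshOnlyEnabled }, maxLoss = 0.05) {
  if (meshOnlyEnabled) return 'mesh'; // forced mesh, whatever the network conditions
  if (participants > 4) return 'sfu';
  if (packetLossRatio > maxLoss) return 'sfu';
  return 'mesh';
}
```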

### Join Conversation

`Conversation.join()` makes the local user enter the conversation. Note that this method returns a `Promise` and one must wait for it to be fulfilled before doing anything else on the conversation.

A good practice is to register all required `Conversation` event listeners before calling the `join` method:

```javascript
conversation.on('streamListChanged', streamInfo => {
  // Handle the 'streamListChanged' event...
});
// and any other relevant events
// ...

conversation.join()
  .then(() => {
    // local user successfully joined the conversation.
  }).catch(error => {
    // error
  });
```

### Leave Conversation

`Conversation.leave()` makes the local user leave the conversation. All ongoing streams will be automatically closed.

A good practice is to destroy the `Conversation` after leaving it, unless you want to be able to join it again afterward.

```javascript
conversation.leave()
  .then(() => {
    conversation.destroy();
  });
```

### Conversation Moderation

Moderation allows a group of users (moderators) to control the conversation's access to other participants.

#### Activation

In order to activate the moderation for a conversation, every party (moderator or not) must explicitly set the `moderationEnabled` option to `true` when calling [getOrCreateConversation](https://apirtc.github.io/references/apirtc-js/Session.html#getOrCreateConversation__anchor).

Additionally, moderator participants must also set the `moderator` option to `true` when calling [getOrCreateConversation](https://apirtc.github.io/references/apirtc-js/Session.html#getOrCreateConversation__anchor).

#### Joining process

{% @mermaid/diagram content="graph TD
ModeratorInterface\[Moderator Interface] --> |getOrCreateConversation with moderationEnabled: true & moderator: true| CreateConv\[Create a Conversation] --> |join| ModeratorJoined\[Moderator joins the conversation immediately]
ParticipantInterface\[Guest Interface] --> |getOrCreateConversation with moderationEnabled: true & moderator: false| GuestReady\[Create a Conversation]
GuestReady --> |join| GuestWait\[Guest joins the waiting room]
ModeratorJoined --> ModeratorSignaled{Moderator receives event}
ModeratorSignaled --> |Accept| ModeratorGuest\[Guest joins the conversation]
ModeratorSignaled --> |Decline| GuestLeave\[Guest leaves the conversation]
GuestWait --> ModeratorSignaled
ModeratorGuest --> |Moderator ejects Guest| GuestLeave
ModeratorGuest --> |Guest leaves| GuestLeave
ModeratorGuest --> |Moderator leaves| ModeratorLeave\[Moderator is out of the conversation]" %}

If the local user is a moderator, `join()` will resolve immediately.

If the local user is not a moderator, `join()` will only resolve once a moderator allows it. In the meantime, the user is put in a waiting room.

### Waiting Room

The waiting room is a [presence group](#presence-group) associated with the conversation. It identifies participants who are currently waiting for a moderation decision.

Events `contactJoinedWaitingRoom` and `contactLeftWaitingRoom` will be triggered respectively upon the arrival and departure of a user to/from the waiting room:

```javascript
conversation.on('contactJoinedWaitingRoom', contact => {
  // A candidate joined the waiting room.
  // Store it into a list and display it in the DOM
  // ...
});

conversation.on('contactLeftWaitingRoom', contact => {
  // A candidate left the waiting room.
  // Remove from list
  // ...
});
```

Then the moderator can allow or deny a contact to enter the conversation:

```javascript
// Grant...
conversation.allowEntry(contact);
// ... or deny access.
conversation.denyEntry(contact);
```

### Eject

Moderators have the ability to eject another participant from the conversation.

```javascript
conversation.eject(contact, { reason: 'a reason' })
  .then(() => {
    console.log('ejected', contact);
  }).catch((error) => {
    console.error('eject error', error);
  });
```

To get notified of a participant's ejection, listen on the `participantEjected` event. The event data object carries a `self` boolean set to `true` if the local user is the ejected participant.

```javascript
conversation.on('participantEjected', data => {
  console.log('on:participantEjected', data);
  if (data.self) {
    // local user was ejected,
    // unpublish streams,
    // and destroy the conversation
  }
});
```

### Record the conversation

The ApiRTC platform allows recording a composite video of a conversation. The video is composed of all streams exchanged in the conversation and is stored in ApiRTC's database.

To start recording a conversation:

```javascript
conversation.startRecording().then(recordingInfo => {
  console.info('startRecording', recordingInfo);
}).catch((error: any) => {
  console.error('startRecording', error);
});
```

Refer to [`Conversation.startRecording(options)`](https://apirtc.github.io/references/apirtc-js/Conversation.html#startRecording__anchor) for details on the available options.

Example of `recordingInfo` ([`RecordingInfo`](https://apirtc.github.io/references/apirtc-js/global.html#RecordingInfo)) data:

```json
{
    "roomName": "Test",
    "callId": "COMPOSITE",
    "recordType": "composite",
    "convId": "2b0839f5-aa1e-4cb2-ba9a-46848a6b",
    "mediaId": "1261785",
    "mediaURL": "https://dashboard.apirtc.com/media/showVideo/<id>/hash/2c625610-4baa-11ec-a192-538513dee1ef",
    "recordedFileName": "vfrP9vWu-3467-composite.mp4",
    "audioOnly": false,
    "videoOnly": false,
    "mode": "complete",
    "labelEnabled": false
}
```

To stop recording a conversation:

```javascript
conversation.stopRecording().then(recordingInfo => {
  console.info('stopRecording', recordingInfo);
}).catch((error: any) => {
  console.error('stopRecording', error);
});
```

Once the recording is stopped, the ApiRTC platform will process it and make it available for download. To get notified when a recording is available, listen to the `recordingAvailable` event of the `Conversation` instance:

```javascript
conversation.on('recordingAvailable', recordingInfo => {
  console.log("on:recordingAvailable", recordingInfo);
  ...
});
```

When the video is available, you can use the `RecordingInfo.mediaURL` to download it.

### Speaker detection

To display which participant is currently talking in a `Conversation`, enable the feature:

```javascript
userAgent.enableActiveSpeakerDetecting(true, { threshold: 50 });
```

Then, listen on the `audioAmplitude` event:

```javascript
conversation.on('audioAmplitude', amplitudeInfo => {
  // handle amplitudeInfo
})
```

The event data object (`amplitudeInfo`) holds the following information:

```json
{
  "streamId": "6725958108801516",
  "amplitude": 102.36,
  "isSpeaking": true
}
```

When the participant speaks and `amplitude` goes over the `threshold` configured when enabling the feature, an event is fired with `isSpeaking` set to `true`.

When the participant stops speaking, the event is fired again with the initial `amplitude` value that triggered it, but this time `isSpeaking` is `false`.
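This firing behaviour can be sketched as a small state machine. The function below is a pure re-implementation for illustration, not ApiRTC internals:

```javascript
// Sketch: emit once when amplitude crosses the threshold (isSpeaking: true),
// and once when it falls back under it, replaying the amplitude that
// triggered the event (isSpeaking: false).
function createSpeakerDetector(threshold, emit) {
  let speaking = false;
  let triggerAmplitude = 0;
  return function onAmplitude(streamId, amplitude) {
    if (!speaking && amplitude > threshold) {
      speaking = true;
      triggerAmplitude = amplitude;
      emit({ streamId, amplitude, isSpeaking: true });
    } else if (speaking && amplitude <= threshold) {
      speaking = false;
      // fired again with the initial amplitude that triggered the event
      emit({ streamId, amplitude: triggerAmplitude, isSpeaking: false });
    }
  };
}
```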

### QoS statistics

The `Conversation` event `callStatsUpdate` provides quality-of-service statistics on media stream exchanges.

```javascript
conversation.on('callStatsUpdate', callStats => {
  // handle callStats.stats data
});
```

Depending on whether the stream is sent or received, the event data object (`callStats`) holds the following information:

For a local stream, QoS info on sent media:

```json
{
    "streamId": "7167592935479248",
    "stats": {
        "audioSent": {
            "bitsSentPerSecond": 22044,
            "bytesSent": 54006,
            "delay": 0,
            "kind": "audio",
            "mediaType": "audio",
            "nackCount": 0,
            "packetLossRatio": 0,
            "packetsSent": 982,
            "packetsSentPerSecond": 50,
            "remoteId": "9385e3a0",
            "samplingInterval": 10,
            "timestamp": 1633005140,
            "type": "outbound-rtp"
        },
        "videoSent": {
            "bitrateMean": 490785.10526315786,
            "bitrateStdDev": 54128.26265341604,
            "bitsSentPerSecond": 517595,
            "bytesSent": 1245545,
            "delay": 0,
            "droppedFrames": 2,
            "firCount": 0,
            "framerateMean": 30.315789473684212,
            "framerateStdDev": 0.749268649265355,
            "framesEncoded": 562,
            "framesEncodedPerSecond": 30,
            "height": 480,
            "kind": "video",
            "mediaType": "video",
            "moyDelay": null,
            "nackCount": 2,
            "packetLossRatio": 0,
            "packetsSent": 1232,
            "packetsSentPerSecond": 63,
            "pliCount": 4,
            "qpSum": 20935,
            "remoteId": "526431ec",
            "samplingInterval": 10,
            "timestamp": 1633005140,
            "type": "outbound-rtp",
            "width": 640
        },
        "quality": {
            "mosS": "NoStream",
            "mosSAV": 3.087473118525441,
            "mosSS": 4.409150284259602,
            "mosSV": 3.4956463628881274,
            "mosV": "NoStream"
        }
    }
}
```

For a remote stream, QoS info on received media:

```json
{
    "streamId": "362307064506733",
    "stats": {
        "audioReceived": {
            "bitsReceivedPerSecond": 22044,
            "bytesReceived": 109505,
            "delay": 0,
            "jitter": 0.002,
            "kind": "audio",
            "mediaType": "audio",
            "nackCount": 0,
            "packetLossRatio": 0,
            "packetsLost": 1,
            "packetsLostPerSecond": 0,
            "packetsReceived": 1991,
            "packetsReceivedPerSecond": 50,
            "remoteId": "f14eaf8",
            "samplingInterval": 20,
            "timestamp": 1633005174,
            "type": "inbound-rtp"
        },
        "videoReceived": {
            "bitrateMean": 773107.5384615384,
            "bitrateStdDev": 176425.8098193486,
            "bitsReceivedPerSecond": 910830,
            "bytesReceived": 3874287,
            "delay": 0,
            "discardedPackets": 0,
            "firCount": 0,
            "framerateMean": 30.076923076923073,
            "framerateStdDev": 0.4220635637221745,
            "framesDecoded": 1167,
            "framesDecodedPerSecond": 30,
            "height": 480,
            "jitter": 0.009,
            "kind": "video",
            "mediaType": "video",
            "nackCount": 13,
            "packetLossRatio": 0.09086778736937756,
            "packetsLost": 3,
            "packetsLostPerSecond": 0,
            "packetsReceived": 3790,
            "packetsReceivedPerSecond": 110,
            "pliCount": 1,
            "remoteId": "8a312f14",
            "samplingInterval": 20,
            "timestamp": 1633005174,
            "type": "inbound-rtp",
            "width": 640
        },
        "quality": {
            "mosAV": 3.4344075003680574,
            "mosS": 4.409150284259602,
            "mosSS": "NoStream",
            "mosSV": "NoStream",
            "mosV": 3.860635735132783
        }
    }
}
```

The `callStats.streamId` is useful to associate the data with the corresponding stream.

{% hint style="info" %}
See the [`Stream` reference page](https://apirtc.github.io/references/apirtc-js/Stream.html) for more information.
{% endhint %}
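Since both shapes share the same structure, a compact summary can be extracted with a single helper. This is an illustrative sketch over the JSON shown above (the helper name and the chosen fields are arbitrary):

```javascript
// Sketch: pull a compact QoS summary out of a callStats event, covering both
// the sent (local stream) and received (remote stream) shapes shown above.
function summarizeCallStats(callStats) {
  const s = callStats.stats;
  const video = s.videoSent || s.videoReceived || {};
  const audio = s.audioSent || s.audioReceived || {};
  return {
    streamId: callStats.streamId,
    direction: (s.videoSent || s.audioSent) ? 'sent' : 'received',
    videoPacketLossRatio: video.packetLossRatio,
    audioPacketLossRatio: audio.packetLossRatio,
    framerateMean: video.framerateMean,
  };
}
```

It can then be used inside the `callStatsUpdate` listener to feed a quality indicator in the UI.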

## Stream

### Local Streams

#### Camera

Acquiring camera local stream is done through the `UserAgent.createStream(options)` method. The browser asks the user to choose among available media devices. The `Promise` resolves with a [`Stream`](https://apirtc.github.io/references/apirtc-js/Stream.html) instance.

```javascript
userAgent.createStream({
  constraints: {
    audio: true,
    video: true
  }
}).then(localStream => {
  // ...
}).catch(error => {
  // error
});
```

All possible options for the `createStream` method can be found in the [`CreateStreamOptions` reference page](https://apirtc.github.io/references/apirtc-js/global.html#CreateStreamOptions).

The `constraints` option is of type [MediaStreamConstraints](https://www.w3.org/TR/mediacapture-streams/#dom-mediastreamconstraints). See the [Stream constraints section](#stream-constraints) for more details.

#### Screen sharing

Acquiring screen sharing local stream is done through a `Stream` static method:

```javascript
// Returns a Promise.<Stream> containing the stream
Stream.createScreensharingStream().then(localStream => {
  // ...
}).catch(error => {
  // error
});
```

#### Publish/Unpublish

Publishing a local stream makes it available for remote peer participants to subscribe and view it.

The local user (`UserAgent`) must have joined the conversation before publishing a stream.

```javascript
conversation.publish(localStream).then(publishedStream => {
  // local stream is published
}).catch(error => {
  // error
});
```

`Conversation.publish(localStream, options)` can optionally take `PublishOptions` second parameter object to control publication. Please check reference for details on [PublishOptions](https://apirtc.github.io/references/apirtc-js/global.html#PublishOptions).

Unpublishing a local stream makes it unavailable for remote peer participants to subscribe and stops sending media stream to peers.

```javascript
conversation.unpublish(localStream);
```

### Remote Streams

#### Handle remote streams availability

ApiRTC signals stream availability changes through the `Conversation.streamListChanged` event.

This event is triggered:

* once for each existing stream when the participant joins the Conversation
* every time a new stream is published to or unpublished from the Conversation

{% hint style="warning" %}
The data object carried by the `Conversation.streamListChanged` event is a `StreamInfo`: this is not a `Stream` yet.
{% endhint %}

```javascript
conversation.on('streamListChanged', streamInfo => {
  const streamId = String(streamInfo.streamId);
  const contactId = String(streamInfo.contact.getId());
  if (streamInfo.isRemote === true) {
    if (streamInfo.listEventType === 'added') {
      // a remote stream was published
      ...
    } else if (streamInfo.listEventType === 'removed') {
      // a remote stream is not published anymore
      ...
    }
  }
});
```

The `streamInfo.contact.getId()` and `streamInfo.streamId` can be useful to identify which remote peer published their stream.
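A common pattern is to keep a registry of the currently published remote streams from this event. The handler below is a minimal sketch:

```javascript
// Sketch: track which remote stream ids are currently published, and by whom,
// from streamListChanged events.
const remoteStreams = new Map(); // streamId -> contactId

function onStreamListChanged(streamInfo) {
  if (!streamInfo.isRemote) return;
  const streamId = String(streamInfo.streamId);
  if (streamInfo.listEventType === 'added') {
    remoteStreams.set(streamId, String(streamInfo.contact.getId()));
  } else if (streamInfo.listEventType === 'removed') {
    remoteStreams.delete(streamId);
  }
}
```

Wire it up with `conversation.on('streamListChanged', onStreamListChanged);` and the registry can drive the list of streams offered for subscription.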

#### Subscribe to a remote stream

A remote stream is subscribed to using the `Conversation.subscribeToStream(streamId)` method. It takes the id of the stream provided in the `StreamInfo` data object:

```javascript
conversation.subscribeToStream(streamInfo.streamId);
```

{% hint style="info" %}
Be mindful that without subscribing to a stream's events, you will not be notified of stream updates and termination.
{% endhint %}

`Conversation.subscribeToStream(streamId, options)` can optionally take [`SubscribeOptions`](https://apirtc.github.io/references/apirtc-js/global.html#SubscribeOptions) as a second parameter to control the subscription.

#### Unsubscribe from a remote stream

Unsubscribing from a remote stream is done with the `Conversation.unsubscribeToStream(streamId)` method.

```javascript
conversation.unsubscribeToStream(streamId);
```

### Manage media streams

Once a stream has been subscribed to, ApiRTC provides the actual `Stream` instance through the `streamAdded` event.

This event is triggered every time the actual media stream is available to be displayed.

```javascript
conversation.on('streamAdded', remoteStream => {
  // display media stream
  ...
});
```

Whenever a media stream becomes unavailable, ApiRTC notifies the conversation with a `streamRemoved` event.

```javascript
conversation.on('streamRemoved', remoteStream => {
  // undisplay media stream
  ...
});
```

{% hint style="warning" %}
A media stream may encounter technical issues, or network optimization may require changing the actual `Stream` instance. In this case a `streamRemoved` event will also be fired, prior to another `streamAdded` event with the new instance.
{% endhint %}
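Because of this replacement behaviour, it helps to key display bookkeeping on `streamId` rather than on the `Stream` instance itself. In the minimal sketch below, `displayStream` and `undisplayStream` stand for hypothetical application helpers:

```javascript
// Sketch: key display bookkeeping on streamId rather than the Stream instance,
// so a replacement instance (streamRemoved then streamAdded) re-attaches cleanly.
const displayed = new Map(); // streamId -> Stream instance currently shown

function makeStreamHandlers(displayStream, undisplayStream) {
  return {
    onStreamAdded(remoteStream) {
      displayed.set(String(remoteStream.streamId), remoteStream);
      displayStream(remoteStream);
    },
    onStreamRemoved(remoteStream) {
      displayed.delete(String(remoteStream.streamId));
      undisplayStream(remoteStream);
    },
  };
}
```

The two handlers would then be registered on the conversation's `streamAdded` and `streamRemoved` events.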

### Stream display

In order to display or remove media elements in the DOM, you can use our helpers:

* `Stream.addInDiv()` and `Stream.removeFromDiv()` to add/remove a `<video>` element within an existing `<div>`
* `Stream.attachToElement(domElement)` to directly attach to a `<video>` element.

Our helpers handle some device specificities, which can help avoid media playback issues (for instance with Safari on iOS).

```javascript
// display media stream by attaching to a media element (like <video>)
stream.attachToElement(videoDomElement)
// or insert into a container div
stream.addInDiv('container-id', 'media-element-' + stream.streamId, {}, false)
```

### Audio/Video Mute

To control local or remote stream audio/video mute, use the following `Stream` methods:

{% tabs %}
{% tab title="Starting from apiRTC 5.0.1" %}
Starting with apiRTC 5.0.1, the API evolved to reflect the [standard](https://www.w3.org/TR/mediacapture-streams/#dom-mediastreamtrack-enabled).

Mute state is managed through the enabled/disabled attributes at the application level.

```javascript
// toggle audio
if (stream.isAudioEnabled()) {
  stream.disableAudio();
} else {
  stream.enableAudio();
}

// toggle video
if (stream.isVideoEnabled()) {
  stream.disableVideo();
} else {
  stream.enableVideo();
}
```

{% endtab %}

{% tab title="Older versions" %}

```javascript
// toggle audio
if (stream.isAudioMuted()) {
  stream.unmuteAudio();
} else {
  stream.muteAudio();
}

// toggle video
if (stream.isVideoMuted()) {
  stream.unmuteVideo();
} else {
  stream.muteVideo();
}
```

{% endtab %}
{% endtabs %}
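When an application must support both generations, one option is a feature-detecting wrapper; this sketch prefers the 5.0.1+ methods when present:

```javascript
// Sketch: version-agnostic audio toggle, using feature detection to pick
// between the apiRTC >= 5.0.1 API and the older mute API.
function toggleAudio(stream) {
  if (typeof stream.isAudioEnabled === 'function') {
    // apiRTC >= 5.0.1
    stream.isAudioEnabled() ? stream.disableAudio() : stream.enableAudio();
  } else {
    // older versions
    stream.isAudioMuted() ? stream.unmuteAudio() : stream.muteAudio();
  }
}
```

The same pattern applies to the video methods.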

### Stream constraints

Constraints are camera properties that can be set: resolution, brightness, contrast, frameRate, saturation, torch, zoom.

Capabilities are supported properties and value ranges. Settings are the current properties values.

ApiRTC allows accessing constraints, capabilities and settings on both local and remote streams, using the same methods. This means you can easily control both local and remote devices.

`Stream.applyConstraints(constraints)` method returns a Promise resolved when all constraints are applied:

```javascript
stream.applyConstraints({
  "audio": {},
  "video": {
    "frameRate": 10
  }
}).then(() => {
  ... // constraints applied
});
```

{% hint style="info" %}
Note that `constraints` is of type [`MediaStreamConstraints`](https://www.w3.org/TR/mediacapture-streams/#dom-mediastreamconstraints).
{% endhint %}

`Stream.getConstraints()` returns a Promise with all properties that were modified and their current values:

```javascript
// get stream constraints that were applied and their values
stream.getConstraints()
  .then(constraints => {
    console.log(constraints) // constraints object
  }).catch((error) => {
    ... // error during process
  });
```

{% hint style="warning" %}
Constraint values depend on the device capabilities. For example, on smartphones with multiple back cameras, the torch property is sometimes only attached to one of the cameras.
{% endhint %}

In addition, supported properties can be queried using `Stream.getCapabilities()`, which returns a Promise with the accepted value ranges:

```javascript
// get stream capabilities values ranges
stream.getCapabilities()
  .then(capabilities => {
    console.log(capabilities) // capabilities object
  }).catch((error) => {
    ... // error during process
  });

```

Example of a `capabilities` data object:

```json
{
  "audio": {
    "autoGainControl": [ true, false ],
    "channelCount": { "max": 1, "min": 1 },
    "deviceId": "...",
    "echoCancellation": [ true, false ],
    "groupId": "...",
    "latency": { "max": 0.085333, "min": 0.002666 },
    "noiseSuppression": [ true, false ],
    "sampleRate": { "max": 48000, "min": 48000 },
    "sampleSize": { "max": 16, "min": 16 }
  },
  "video": {
    "aspectRatio": { "max": 1920, "min": 0.001388888888888889 },
    "deviceId": "...",
    "frameRate": { "max": 30, "min": 0 },
    "groupId": "...",
    "height": { "max": 1080, "min": 1 },
    "resizeMode": ["none", "crop-and-scale"],
    "width": { "max": 1920, "min": 1 }
  }
}
```

In this example `video.frameRate` property may be set between 0 and 30.

{% hint style="warning" %}
`getCapabilities()` may not work in all browsers. Also, the returned capabilities may differ from one device to another.
{% endhint %}
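One practical use of the ranges is clamping a requested value into the advertised bounds before calling `applyConstraints()`; a minimal sketch:

```javascript
// Sketch: clamp a requested value into the {min, max} range advertised by
// getCapabilities() before passing it to applyConstraints().
function clampToCapability(requested, range) {
  if (!range) return undefined; // property not advertised: not supported on this device
  return Math.min(range.max, Math.max(range.min, requested));
}

// e.g. with the capabilities object above:
const frameRate = clampToCapability(60, { max: 30, min: 0 }); // -> 30
```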

Finally, property values can be checked with `Stream.getSettings()`, which returns all current settings in a Promise:

```javascript
// get stream actual constraints settings
stream.getSettings()
  .then(settings => {
    console.log(settings) // settings object
  }).catch((error) => {
    ... // error during process
  });

```

Example of a `settings` data object:

```json
{
  "audio": {
    "autoGainControl": true,
    "channelCount": 1,
    "deviceId": "...",
    "echoCancellation": true,
    "groupId": "...",
    "latency": 0.01,
    "noiseSuppression": true,
    "sampleRate": 48000,
    "sampleSize": 16
  },
  "video": {
    "aspectRatio": 1.333333333333,
    "deviceId": "...",
    "frameRate": 30,
    "groupId": "...",
    "height": 480,
    "resizeMode": "none",
    "width": 640
  }
}
```

In this example `video.frameRate` is a supported property and its current value is 30.

`video.zoom` is not present in the returned object, so it is not a supported property for this combination of device, camera and browser.
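Checking support can thus be reduced to a presence test on the settings object; a minimal sketch:

```javascript
// Sketch: a video property is considered supported if it appears in the
// settings object returned by getSettings().
function isVideoPropertySupported(settings, property) {
  return settings.video !== undefined && property in settings.video;
}
```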

## Stream Transformation

### Audio filters: noise reduction - applyAudioProcessor()

{% hint style="info" %}
The noise reduction feature is available as of apiRTC 5.0.0.
{% endhint %}

ApiRTC allows creating a stream with a noise reduction filter.

{% hint style="info" %}
Check the [noise reduction tutorial](https://apirtc.github.io/ApiRTC-examples/streams_applyAudioProcessor/)
{% endhint %}

The [`applyAudioProcessor`](https://apirtc.github.io/references/apirtc-js/Stream.html#applyAudioProcessor__anchor) helper manages the different stream states for you (i.e. switching from noise reduction back to normal mode).

To start the noise reduction process on a Stream:

```javascript
stream.applyAudioProcessor('noiseReduction').then((streamWithEffect) => {
...
})
```

This method returns a `streamWithEffect` Stream object: an encapsulation of the base `stream` with a noise reduction filter applied to it.

This means that the base `stream` and the `streamWithEffect` stream remain linked:

* if the base `stream` audio is muted, the `streamWithEffect` stream audio will be too,
* if the base `stream` is released, the `streamWithEffect` stream will be too.

Both streams need to be handled by the application while the noise reduction process is active.
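One way to keep track of which instance is currently in use is a small wrapper around `applyAudioProcessor` (an illustrative sketch; the manager name is hypothetical):

```javascript
// Sketch: remember which processed stream is current, so the application
// always publishes/mutes/releases the right instance.
function createAudioEffectManager(baseStream) {
  let current = baseStream;
  return {
    setEffect(effect) { // 'noiseReduction' or 'none'
      return baseStream.applyAudioProcessor(effect).then((processed) => {
        current = processed;
        return processed;
      });
    },
    get stream() { return current; },
  };
}
```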

To stop the noise reduction process:

```javascript
// stop the noise reduction from base stream
stream.applyAudioProcessor('none').then((streamWithoutEffect) => {
...
})
```

{% hint style="warning" %}
If an error occurs during the `applyAudioProcessor()` process, apiRTC will reject the promise but will try to restore the stream with the previous effect.

Error description is available in the [ApiRTC JS Library Reference](https://apirtc.github.io/references/apirtc-js/ApplyAudioProcessorError.html)
{% endhint %}

Additionally, ApiRTC gives you access to `Stream.startNoiseReduction` and `Stream.stopNoiseReduction` methods.

### Background subtraction: blur, background image - applyVideoProcessor()

ApiRTC allows creating a background-blurred stream, or adding a background image, based on an original stream.

{% hint style="info" %}
Have you checked the [blur application tutorial](https://apirtc.github.io/ApiRTC-examples/streams_applyVideoProcessor/)?
{% endhint %}

The [`applyVideoProcessor`](https://apirtc.github.io/references/apirtc-js/Stream.html#applyVideoProcessor) helper manages the different stream states (i.e. switching from blur to a background image, etc.).

To start the blur process on a stream:

```javascript
stream.applyVideoProcessor('blur').then((streamWithEffect) => {
...
})
```

This method returns a `streamWithEffect` Stream object: an encapsulation of the base `stream` with the blur filter applied to it.

This means that the base `stream` and the `streamWithEffect` stream remain linked:

* if the base `stream` audio is muted, the `streamWithEffect` stream audio will be too,
* if the base `stream` is released, the `streamWithEffect` stream will be too.

Both streams need to be handled by the application while the video processing is active.

Use the stream with effect as a local stream:

```javascript
// display streamWithEffect media stream by attaching to a media element (like <video>)
streamWithEffect.attachToElement(videoDomElement)

// publish the streamWithEffect stream
conversation.publish(streamWithEffect).then(() => {
  ...
});
```

To stop the blur process:

```javascript
// stop blur from original stream
stream.applyVideoProcessor('none').then((streamWithoutEffect) => {
...
})
```

Additionally, ApiRTC gives you access to `Stream.blur()`, `Stream.unblur()`, `Stream.backgroundImage()`, `Stream.unBackgroundImage()`.

## Whiteboard

The whiteboard component enables participants to interact together with:

* lines (`pen`)
* shapes (`arrow`, `rectangle` or `ellipse`)
* texts
* and also an eraser (`eraser`)

Lines weight and colors, and text size are configurable. Undo & redo functions are available (`whiteboardClient.undo` and `whiteboardClient.redo`). The whiteboard area can be zoomed in and out (`whiteboardClient.setScale`), and shifted around (`whiteboardClient.setOffset`). The whiteboard can be erased at once with the `whiteboardClient.deleteHistory` function.

Adding a whiteboard to a web page takes a few lines:

```javascript
// ...
conversation.startNewWhiteboardSession('canvas-element-id'); // instantiate a whiteboard in a canvas
whiteboardClient = userAgent.getWhiteboardClient(); // retrieve the WhiteboardClient object
whiteboardClient.setFocusOnDrawing(true); // follow drawings done by other users when the canvas is in a scrollable container
// ...
```

See the whiteboard in action in the following GitHub repos:

* [Whiteboard example](https://github.com/ApiRTC/ApiRTC-examples/tree/master/conferencing_whiteboard)/[Demo](https://apirtc.github.io/ApiRTC-examples/conferencing_whiteboard/index.html)
* [Whiteboard with invite example](https://github.com/ApiRTC/ApiRTC-examples/tree/master/conferencing_whiteboard_invitation)/[Demo](https://apirtc.github.io/ApiRTC-examples/conferencing_whiteboard_invitation/index.html)
