1 Introduction

The Integrator's Guide for Jabra devices helps you correctly integrate call control into your softphone application, which in turn can improve the user experience.

End users can control calls in several ways, for example by pressing physical buttons on the Jabra device or by lifting the boom arm, placing full control of calls at the user's fingertips.

1.1 How to use this guide

If you are not familiar with the integration of Jabra devices, it is recommended that you read this guide from beginning to end to explore the basic concepts and important notes that can help you correctly integrate call control.

If you have previous experience integrating call control with Jabra devices, you can jump directly to the relevant chapter or section using the left navigation bar.

The following guides can help you avoid common pitfalls during integration:

  • Integrator's Guide
    Explains the concepts behind call control in relation to Jabra devices. It is useful to read this guide from beginning to end before you begin any integration work.
  • Developer's Guide
    Focuses on code samples and API calls more than concepts. This guide is more specific and contains more details.

If you encounter any issues or have feedback, create a support ticket on the Jabra Developer Zone page.

In addition, for any information regarding terminology, you can refer to the Jabra glossary.

1.2 Before you begin

Before you begin integration with Jabra devices, it is strongly recommended that you read the following reference topics:

1.2.1 Understanding event signals

To integrate the call control API correctly, you must understand the following key concept regarding events:

Events sent from the device to the application are not simple keypresses. There is no one-to-one relation between pressing a button on the Jabra device and the application receiving an event.

Instead, events sent from the device can be triggered in different circumstances and are signals notifying the application that something has happened.

One circumstance can be that the user expresses an intent to do something by interacting with the Jabra device, for example pressing a physical button.

Another trigger can be an acknowledgment of a request from the application, or an error condition, such as the headset being out of range of the base station.

Therefore, you cannot interpret events as only button presses.
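The distinction above can be sketched in code. The following is an illustrative model only; the event names, types, and handler are hypothetical and are not part of the Jabra SDK. It shows why a handler must consult application state to decide whether an incoming event expresses user intent or acknowledges an earlier request:

```typescript
// Illustrative only: all names below are hypothetical, not from the Jabra SDK.

// The same event name can arrive under different circumstances.
type JabraEvent = "offHook" | "onHook" | "mute" | "outOfRange";

interface CallState {
  // True while the application is waiting for the device to
  // acknowledge a command it has just sent.
  awaitingAck: boolean;
}

// Interpret an event as user intent, an acknowledgment, or an error,
// depending on what the application itself has recently done.
function interpretEvent(event: JabraEvent, state: CallState): string {
  if (event === "outOfRange") {
    return "error: headset lost link to base station";
  }
  if (state.awaitingAck && (event === "offHook" || event === "onHook")) {
    state.awaitingAck = false; // the pending request is now acknowledged
    return `acknowledgment of application command (${event})`;
  }
  return `user intent (${event})`;
}
```

The key design point is that the event alone does not carry its own meaning; the application's state determines how it must be interpreted.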

The names of the callback functions registered during initialization (see chapter 3 - The Software Development Kit) may suggest that they represent button presses; however, this is not the case.

Certain callback function names or function parameters might reinforce the misunderstanding that an event corresponds to a button press. While this misunderstanding may not have a major impact on Mute signals, it does affect the request/acknowledge-style interactions described in the Request / acknowledge interactions section of this guide.

The following are examples, not an exhaustive list, of such callback function names and function parameters:

  • ButtonInDataTranslatedFunc (C)
  • buttonInData (C)
  • TranslatedButtonInput (C#)
  • TranslatedButtonInputEventArgs (C#)

An event that signals user intent and an event that acknowledges an application request are identical on arrival; this is especially important for the off-hook function. If the application is written with a generic keypress handler, ending a call from the application by sending an on-hook command to the device results in a 'ghost keypress' arriving from the device up to two seconds later.

This, however, is not a keypress or button press, but rather the device's acknowledgment of the on-hook command from the application.
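One common way to handle this is to open a short time window after sending the command, during which a matching event is treated as the acknowledgment rather than a new user action. The following sketch is illustrative only; the class, method names, and the two-second window constant are assumptions for this example and are not part of the Jabra SDK:

```typescript
// Illustrative sketch, not the Jabra SDK API: suppress the "ghost"
// event that follows an application-initiated on-hook command.

const ACK_WINDOW_MS = 2000; // the device may acknowledge up to ~2 s later

class CallController {
  private pendingOnHookUntil = 0;
  private now: () => number;

  // A clock function is injectable so the logic can be tested.
  constructor(now: () => number = Date.now) {
    this.now = now;
  }

  // Application ends the call: send the on-hook command to the device
  // (sending omitted here) and open a window in which an incoming
  // on-hook event is interpreted as the acknowledgment.
  endCallFromApp(): void {
    this.pendingOnHookUntil = this.now() + ACK_WINDOW_MS;
  }

  // Returns true if this on-hook event is the device's acknowledgment
  // of our own command rather than a new user action.
  isGhostOnHook(): boolean {
    if (this.now() <= this.pendingOnHookUntil) {
      this.pendingOnHookUntil = 0; // consume the window
      return true;
    }
    return false;
  }
}
```

Consuming the window on the first matching event ensures that a genuine user action arriving later within the same two seconds is not accidentally swallowed as well.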

1.2.2 Supported platforms

Integration is supported on all major platforms, including web browsers. For details, refer to the Integration Components page.