
Getting Started: Building for Apple iPhone, iPad and iPod Touch

This document has moved to: https://docs.pjsip.org/en/latest/get-started/ios/build_instructions.html

  1. Features
  2. Requirements
  3. Build Preparation
  4. Building PJSIP
    1. Supporting multiple architectures (armv6, armv7, armv7s, arm64, and so on)
    2. Setting minimum supported iOS version
    3. Simulator
    4. Bitcode
  5. Using PJSIP in your application
    1. PJSIP in Swift application
  6. Video Support
    1. Features
    2. Requirements
    3. Configuring
    4. Video capture orientation support
  7. TLS/OpenSSL Support
  8. Common problems
    1. PushKit guide, to accept calls in the background after kCFStreamNetworkServiceTypeVoIP is deprecated (iOS 16/iOS 10/iOS 9)
    2. CallKit integration and audio session (AVAudioSession) management (iOS 10)
    3. Crash after calling PJLIB APIs using Grand Central Dispatch (GCD)
    4. Audio lost or other issues with interruption (by a phone call or an alarm), headset plug/unplug, or Bluetooth input
    5. SIP transport keepalive while in background
  9. Other Problems (problems specific to a particular iOS version/device)
    1. Unable to accept incoming call in background mode (iOS 8 or before)

Features

Some of the features of the iPhone port:

  • a native CoreAudio-based audio device, which supports the following features:
    • the device's built-in echo canceller
    • output volume setting
    • input latency setting
    • output latency setting
  • support for the built-in iLBC codec
  • video

Requirements

  • iOS SDK, part of Xcode.

Build Preparation

  1. Get the source code, if you haven't already.
  2. Set your config_site.h to the following:
    #define PJ_CONFIG_IPHONE 1
    #include <pj/config_site_sample.h>
    
    This will activate iPhone specific settings in the config_site_sample.h.

Building PJSIP

Just run:

$ cd /path/to/your/pjsip/dir
$ ./configure-iphone
$ make dep && make clean && make

Open ipjsua.xcodeproj, located in pjproject/pjsip-apps/src/pjsua/ios, using Xcode. If you enable video and use libyuv/libopenh264, add those libraries to the application. Build the project and run it. You will see telnet instructions on the device's screen; telnet to that address to operate the application. See the PJSUA CLI Manual for the available commands.

Notes:

  • the ./configure-iphone script is a wrapper that calls the standard ./configure script with settings suitable for the iPhone target.
  • the latest iPhone SDK version will be selected by default. You may change this by setting the IPHONESDK environment variable to the desired SDK path. For ipjsua, change the SDK version via Project - Edit Project Settings - Base SDK and Targets - ipjsua - Get Info - Base SDK.
  • you may pass standard ./configure options to this script too.
  • for more info, run ./configure-iphone --help
  • other customizations are similar to what is explained in Building with GNU page.

Supporting multiple architectures (armv6, armv7, armv7s, arm64, and so on)

You need to compile separately for each architecture, by setting the ARCH environment variable to the desired architecture before running configure-iphone. For example:

export ARCH="-arch arm64"

Then you need to combine the resulting libraries using the lipo command. For example:

lipo -arch armv6 lib/armv6/libpjlib.a -arch armv7 lib/armv7/libpjlib.a -create -output lib/libpjlib.a

Setting minimum supported iOS version

If you want to specify the minimum supported iOS version, set the MIN_IOS environment variable before running configure-iphone, for example:

export MIN_IOS="-miphoneos-version-min=8.0"

The default setting is iOS 7.0. If you don't want to specify this flag, set MIN_IOS to a single space instead (export MIN_IOS=" "). Note that if you don't set the minimum iOS version, you may encounter a linker warning in your Xcode app, which may lead to crashes when running on older iOS versions:

ld: warning: object file (...) was built for newer iOS version (10.0) than being linked (7.0)

Simulator

To configure the build system for the iPhone simulator:

export DEVPATH=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer
# arm64 simulator
ARCH="-arch arm64" CFLAGS="-O2 -m64" LDFLAGS="-O2 -m64" MIN_IOS="-mios-simulator-version-min=13.0" ./configure-iphone
# x86_64 simulator
ARCH="-arch x86_64" CFLAGS="-O2 -m64" LDFLAGS="-O2 -m64" MIN_IOS="-mios-simulator-version-min=13.0" ./configure-iphone
# or 32-bit
# ARCH="-arch i386" CFLAGS="-O2 -m32" LDFLAGS="-O2 -m32" MIN_IOS="-mios-simulator-version-min=13.0" ./configure-iphone
make dep && make clean && make

Note that the exact paths may vary according to your SDK version.

Bitcode

To enable bitcode, use the following steps:

  1. When running the configure script, add -fembed-bitcode to CFLAGS, e.g. CFLAGS=-fembed-bitcode ./configure-iphone.
  2. Run make.
  3. In Xcode, ipjsua -> Build Settings, search for "bitcode" -> set "Enable Bitcode" to "Yes".
  4. Build.

Note that any third-party dependencies, e.g. OpenSSL, will need to be built with bitcode enabled too.


Using PJSIP in your application

To use PJSIP in your application, you need to:

  • Add the required libraries and frameworks. One way to do this is by dragging and dropping the libraries and frameworks from our sample app. Then add the library and header search paths in "Build Settings".
  • Add the required usage permissions for the camera (if you need video calls) and the microphone.
  • Define PJ_AUTOCONF=1 in your Xcode's project config.
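The camera and microphone permissions above correspond to the standard iOS usage-description keys in Info.plist; a minimal fragment might look like this (the description strings are placeholders you should replace with your own wording):

```xml
<!-- Info.plist fragment; without these keys, iOS denies (or terminates on)
     microphone/camera access -->
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is needed for voice calls.</string>
<key>NSCameraUsageDescription</key>
<string>Camera access is needed for video calls.</string>
```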

PJSIP in Swift application

  • For a Swift app, you need to create a bridging header (click File - New - Objective-C File, and click Yes when asked whether to create a bridging header). In the bridging header file, add all the C headers that you need, for example #import <PJSIP/pjsua.h>. You can then directly call any PJSIP C API declared in those headers. If you want to use a C++ API such as PJSUA2, however, you need to create your own Objective-C wrapper. For a sample Swift app, please check ipjsua-swift.xcodeproj located in pjproject/pjsip-apps/src/pjsua/ios-swift (note that this sample Swift app requires video support).
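As a sketch, such a bridging header might look like the following (the file name is hypothetical — Xcode generates one for your project — and the header search paths must point at the PJSIP include directories):

```objc
//
//  MyApp-Bridging-Header.h  (hypothetical name; Xcode generates yours)
//  Exposes the PJSIP C API to Swift. Requires the PJSIP include
//  directories in the project's Header Search Paths.
//
#import <PJSIP/pjsua.h>
```

Swift code can then call C functions declared in the imported headers directly, e.g. pjsua_create().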

Video Support

Features

  • native capture
  • native preview
  • native OpenGL ES renderer
  • H.264 codec (using native VideoToolbox framework or OpenH264 library, see below)

Requirements

libyuv

  1. If you are using PJSIP 2.5.5 or newer, libyuv should be built and enabled automatically; see ticket #1937 for more info.
  2. If you are using PJSIP 2.5.1 or older, follow the instructions in ticket #1776.

OpenH264 or VideoToolbox (if you need H264 codec, choose one of them)

  • For OpenH264, follow the instructions in ticket #1947.
  • For VideoToolbox (supported since PJSIP version 2.7), define this in your config_site.h: #define PJMEDIA_HAS_VID_TOOLBOX_CODEC 1

libvpx (if you need VP8 or VP9 codec)

Build and install libvpx.
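Assuming a PJSIP version with VP8/VP9 support, the codec is then typically enabled with a config_site.h macro; the macro name below is an assumption that you should verify against your version's config_site_sample.h and configure-iphone --help output:

```c
/* config_site.h fragment -- verify the macro name for your PJSIP version */
#define PJMEDIA_HAS_VPX_CODEC   1
```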

Configuring

Sample invocation of ./configure-iphone:

$ ./configure-iphone --with-openh264=/Users/me/opt

If you use openh264, make sure it is detected by ./configure-iphone:

...
Using OpenH264 prefix... /Users/me/opt
checking OpenH264 availability... ok
...

Set these in your config_site.h:

#define PJ_CONFIG_IPHONE 			1
#define PJMEDIA_HAS_VIDEO			1

#include <pj/config_site_sample.h>

Video capture orientation support

To send video in the proper orientation (i.e. head always up, regardless of the device orientation), the application needs to do the following:

  1. Set up the device to get orientation change notifications (by calling UIDevice.beginGeneratingDeviceOrientationNotifications and adding a callback to receive UIDeviceOrientationDidChangeNotification).
  2. Inside the callback, call PJSUA API
    pjsua_vid_dev_set_setting(dev_id, PJMEDIA_VID_DEV_CAP_ORIENTATION, &new_orientation, PJ_TRUE)
    
    to set the video device to the correct orientation.

For sample usage, please refer to the ipjsua sample app. Ticket #1861 explains this feature in detail.
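The two steps above can be sketched as follows in Objective-C. This is illustrative only: cap_dev is a hypothetical variable holding the active capture device id, and the orientation-to-pjmedia_orient mapping shown is an assumption — the authoritative, camera-dependent mapping is in the ipjsua sample:

```objc
/* Sketch: assumes PJSUA is initialized and cap_dev holds the id of the
 * active video capture device. */
- (void)registerForOrientationChanges
{
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter]
        addObserver:self selector:@selector(orientationChanged:)
               name:UIDeviceOrientationDidChangeNotification object:nil];
}

- (void)orientationChanged:(NSNotification *)note
{
    pjmedia_orient ori;
    switch ([UIDevice currentDevice].orientation) {
    case UIDeviceOrientationPortrait:           ori = PJMEDIA_ORIENT_ROTATE_90DEG;  break;
    case UIDeviceOrientationPortraitUpsideDown: ori = PJMEDIA_ORIENT_ROTATE_270DEG; break;
    case UIDeviceOrientationLandscapeLeft:      ori = PJMEDIA_ORIENT_ROTATE_180DEG; break;
    case UIDeviceOrientationLandscapeRight:     ori = PJMEDIA_ORIENT_NATURAL;       break;
    default: return; /* face up/down or unknown: ignore */
    }
    pjsua_vid_dev_set_setting(cap_dev, PJMEDIA_VID_DEV_CAP_ORIENTATION,
                              &ori, PJ_TRUE);
}
```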


TLS/OpenSSL Support

A native TLS backend for iOS and macOS, i.e. using the Network framework, is supported; please check GitHub PR #2482 for more info.

Alternatively, using OpenSSL backend is also supported. Follow the instructions below to enable TLS transport by using OpenSSL:

  1. Build and install OpenSSL 1.1.x; please check the OpenSSL wiki. For example, to build for the arm64 architecture:
    export CROSS_TOP=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/
    export CROSS_SDK=iPhoneOS11.3.sdk
    export CC="/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -arch arm64"
    ./Configure iphoneos-cross --prefix=/Users/teluu/openssl-1.1.0f-iphone64/arm64
    make
    make install
    

Then check that OpenSSL is detected by the configure script:

...
checking for OpenSSL installations..
checking openssl/ssl.h usability... yes
checking openssl/ssl.h presence... no
aconfigure: WARNING: openssl/ssl.h: accepted by the compiler, rejected by the preprocessor!
aconfigure: WARNING: openssl/ssl.h: proceeding with the compiler's result
checking for openssl/ssl.h... yes
checking for ERR_load_BIO_strings in -lcrypto... yes
checking for SSL_library_init in -lssl... yes
OpenSSL library found, SSL support enabled
...
  2. Build the libraries:
    make dep && make
    
  3. In the Xcode project settings of your application (for example, ipjsua), add libssl.a and libcrypto.a from the OpenSSL ARM directory to the project's Libraries:
    1. In the Group & Files pane, expand ipjsua, then right-click Libraries and select Add -> Existing Files....
    2. Find libssl.a and libcrypto.a in the OpenSSL ARM directory (for example, ${HOME}/openssl/openssl_arm) and add them to the project.
  4. Build the app.

Common problems

PushKit guide, to accept calls in the background after kCFStreamNetworkServiceTypeVoIP is deprecated (iOS 16/iOS 10/iOS 9)

Starting in iOS 9, kCFStreamNetworkServiceTypeVoIP is deprecated. Apple recommends that applications use VoIP push notifications (using the PushKit framework) to avoid persistent connections, as described in Apple's official documentation. This requires the application to implement the setup and handling of push notifications in the application layer (for more details, refer to ticket #1941). For now, PJSIP still uses kCFStreamNetworkServiceTypeVoIP; if you want to disable it right away, set PJ_IPHONE_OS_HAS_MULTITASKING_SUPPORT to 0.

Starting from iOS 13, there is a new requirement: apps receiving VoIP pushes must post an incoming call (via CallKit or IncomingCallNotifications) in the same run loop as pushRegistry:didReceiveIncomingPushWithPayload:forType:[withCompletionHandler:], without delay. Otherwise, the app is terminated with:

Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Killing app because it never posted an incoming call to the system after receiving a PushKit VoIP push callback.'

To make this work with the normal SIP flow, which may require you to wait for some time before receiving the INVITE message, please see Apple's recommendation in its developer forum.

Starting from iOS 16, using the VoIP socket (or kCFStreamNetworkServiceTypeVoIP) will cause the app to crash or be killed; under a debugger, the call stack will show a message like the following:

"Linked against modern SDK, VOIP socket will not wake. Use Local Push Connectivity instead"

Applications using PJSIP 2.12.1 or older can use the following settings in config_site.h to avoid this:

  1. Leave PJ_IPHONE_OS_HAS_MULTITASKING_SUPPORT set to 1 (note that it is 1 by default); this maintains the UDP socket-replacement feature, which is still required.
  2. Set PJ_ACTIVESOCK_TCP_IPHONE_OS_BG to 0, or alternatively call pj_activesock_enable_iphone_os_bg(PJ_FALSE) before creating any SIP transport (or any PJSIP socket in general); this disables the VoIP socket.
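Put together, the two settings above would look like this in config_site.h (a sketch; verify the defaults against the config_site_sample.h shipped with your version):

```c
/* config_site.h -- sketch for PJSIP 2.12.1 or older on iOS 16 */
#define PJ_CONFIG_IPHONE                       1

/* Keep the UDP socket-replacement feature (already the default) */
#define PJ_IPHONE_OS_HAS_MULTITASKING_SUPPORT  1

/* Disable the deprecated VoIP TCP socket to avoid the iOS 16 kill */
#define PJ_ACTIVESOCK_TCP_IPHONE_OS_BG         0

#include <pj/config_site_sample.h>
```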

CallKit integration and audio session (AVAudioSession) management (iOS 10)

CallKit requires the application to configure the audio session and start the call audio at specific times. Thus, to ensure a smooth integration, we disable the audio session setup in our sound device wrapper, to avoid conflicts with the application's audio session settings. Starting from ticket #1941, the application needs to set its own audio session category, mode, and activation/deactivation.

Apple's CallKit sample code can be used as a quick start reference.
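A minimal sketch of the CallKit audio handoff with pjsua, assuming pjsua is initialized and the default sound devices are used (this is illustrative, not a complete integration — the application must still set its own audio session category and mode, e.g. AVAudioSessionCategoryPlayAndRecord with AVAudioSessionModeVoiceChat, before reporting the call):

```objc
/* CXProviderDelegate (CallKit) -- start/stop PJSIP's sound device only
 * when iOS activates/deactivates the audio session. */
- (void)provider:(CXProvider *)provider
    didActivateAudioSession:(AVAudioSession *)audioSession
{
    /* iOS has activated the session; start the sound device now. */
    pjsua_set_snd_dev(PJMEDIA_AUD_DEFAULT_CAPTURE_DEV,
                      PJMEDIA_AUD_DEFAULT_PLAYBACK_DEV);
}

- (void)provider:(CXProvider *)provider
    didDeactivateAudioSession:(AVAudioSession *)audioSession
{
    /* Release the sound device when iOS deactivates the session. */
    pjsua_set_no_snd_dev();
}
```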

Crash after calling PJLIB APIs using Grand Central Dispatch (GCD)

PJLIB APIs should be called from a registered thread; otherwise they will raise an assertion such as "Calling pjlib from unknown/external thread...". With GCD, we cannot really be sure which thread is executing the PJLIB function. Registering that thread to PJLIB seems like a simple and easy solution; however, it potentially introduces random crashes that are harder to debug. Here are a few possible crash scenarios:

  • PJLIB's pj_thread_desc must remain valid until the registered thread has stopped; otherwise an invalid pointer access may cause a crash, e.g. in pj_thread_check_stack().
  • There are some compatibility problems between GCD and PJLIB; see #1837 for more info.

If you want to avoid any possibility of a blocking operation by PJLIB (or any higher API layer such as PJMEDIA, PJNATH, or PJSUA, which usually calls PJLIB), instead of dispatching the task using GCD, the safest way is to create and manage your own thread pool and register its threads to PJLIB. Alternatively, simply use the PJSUA timer mechanism (with zero delay); see the pjsua_schedule_timer()/pjsua_schedule_timer2() docs for more info.
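For illustration, the timer-based alternative might look like the following. This is a non-compilable sketch (it depends on the PJSIP headers), and my_task is a hypothetical callback:

```c
#include <pjsua-lib/pjsua.h>

/* Runs on a PJSUA worker thread, which is already registered with
 * PJLIB, so calling PJLIB/PJSUA APIs here is safe. */
static void my_task(void *user_data)
{
    PJ_UNUSED_ARG(user_data);
    /* ... call PJSIP APIs safely here ... */
}

/* Instead of dispatch_async(...), schedule the task with zero delay:
 *
 *     pjsua_schedule_timer2(&my_task, NULL, 0);
 */
```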

Audio lost or other issues with interruption (by a phone call or an alarm), headset plug/unplug, or Bluetooth input

It has been reported that whenever an audio interruption happens, audio is lost until the application is killed and restarted.

Here is the reported working solution:

  • the application should be configured to receive interruption events; see Apple's AVAudioSession doc.
  • forcefully shut down the sound device when the interruption begins, e.g. using pjsua_set_no_snd_dev() for pjsua, or AudDevManager.setNoDev() for pjsua2.
  • restart the sound device after the interruption ends, e.g. using pjsua_set_snd_dev() for pjsua, or AudDevManager.setPlaybackDev()+setCaptureDev() for pjsua2.

Also note that this is the recommended outline of the normal flow for an audio interruption:

  • on interruption begin:
    1. hold the calls
    2. stop any other media, if any (i.e. disconnect all connections in the bridge)
    3. by default, the sound device will be stopped after some idle period once there is no connection in the bridge; alternatively, just forcefully shut down the sound device
  • on interruption end:
    1. unhold the calls
    2. resume any other media, if any
    3. if the sound device was not shut down forcefully, the first connection to the bridge will cause the sound device to be started; otherwise, manually restart the sound device by setting the playback & capture devices

SIP transport keepalive while in background

As the process is normally suspended while the application is in the background, the worker thread that handles the TCP keepalive timer is also suspended. The application therefore needs to schedule periodic wakeups to allow the library to send TCP keepalives. Sample code:

- (void)keepAlive {
    /* Register this thread with PJLIB, if not yet registered */
    if (!pj_thread_is_registered()) {
        static pj_thread_desc   thread_desc;
        static pj_thread_t     *thread;
        pj_thread_register("mainthread", thread_desc, &thread);
    }

    /* Simply sleep for 5s, giving the library time to send the transport
     * keepalive packet and wait for the server response, if any. Don't
     * sleep too short, to avoid too many wakeups, because when there is
     * any response from the server, the app will be woken up again
     * (see also #1482).
     */
    pj_thread_sleep(5000);
}

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    /* Send the keepalive manually at the beginning of the background period */
    pjsip_endpt_send_raw*(...);

    /* iOS requires that the minimum keepalive interval is 600s */
    [application setKeepAliveTimeout:600 handler: ^{
        [self performSelectorOnMainThread:@selector(keepAlive)
              withObject:nil waitUntilDone:YES];
    }];
}

Make sure that the keepalive feature of the SIP transport is not disabled (see the PJSIP_TCP/TLS_KEEP_ALIVE_INTERVAL docs) and that the keepalive interval is set to less than 600s.

Alternatively, configure the server to send a keepalive ping packet, if possible, and have the client respond by sending a keepalive pong back to the server, so there is two-way traffic. As there is no way to detect an incoming ping from the server, the application can currently just send a pong packet whenever it becomes active (the application will be woken up when receiving a TCP packet), e.g. send the pong packet in UIApplication::applicationDidBecomeActive().

Other Problems (problems specific to a particular iOS version/device)

Unable to accept incoming call in background mode (iOS 8 or before)

Starting in iOS 9, this method of accepting incoming calls in the background is deprecated; please have a look at #bg-call.

If, while in the background, ipjsua (or your application) is unable to detect an incoming call and display the local notification, check the following:

  1. Note that the background feature only works with TCP.
  2. Make sure that voip is included in the required background modes (UIBackgroundModes) in the application’s Info.plist file.
  3. Make sure that the TCP socket is successfully wrapped with CFReadStreamRef (check if there is a message: "Failed to configure TCP transport for VoIP usage").
  4. Check whether you can accept the incoming call by bringing the app to the foreground. If yes, make sure that the incoming call request comes from the wrapped TCP socket (check the log for the INVITE request).

Note: these steps do not troubleshoot audio problems.
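The UIBackgroundModes entry from step 2 looks like this in Info.plist:

```xml
<!-- Info.plist fragment: declare the voip background mode -->
<key>UIBackgroundModes</key>
<array>
    <string>voip</string>
</array>
```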

Last modified on Jan 25, 2023 3:36:04 AM
