Changes between Version 13 and Version 14 of Nokia_APS_VAS_Direct


Timestamp:
Feb 23, 2009 10:48:07 AM (15 years ago)
Author:
bennylp
Comment:


  • Nokia_APS_VAS_Direct

    v13 v14  
    3131=== What APS-Direct is really === 
    3232 
    33 To use APS-Direct means that you're opening the sound device in codec (i.e. non-PCM) mode. You '''still have the choice''', at run-time, to open the sound device in PCM mode if you wish, for example to make use of the PCM features in PJMEDIA such as the tone generator, or to use the software codecs such as Speex or G.722 in PJMEDIA. Note that if you use PJSUA-LIB, then the management of closing/re-opening the sound device with the correct codec may be done by PJSUA-LIB automatically. 
     33To use APS-Direct means that you're opening the sound device in codec (i.e. non-PCM) mode. You '''still have the choice''', at run-time, to open the sound device in PCM mode if you wish, for example to make use of the PCM features in PJMEDIA such as the tone generator, or to use the software codecs such as Speex or G.722 in PJMEDIA. Note that if you use PJSUA-LIB, then the management of closing/re-opening the sound device with the correct codec will be done by PJSUA-LIB automatically. 
    3434 
    35 To use APS-Direct also means that you are restricted to using the ''audio switchboard'' at compile time (the audio switchboard will be explained later). This means that you lose the capability of mixing audio with PJSUA-LIB, and there are several other restrictions on your audio routing arrangements. Among other things, you can't have two calls active and connected to the audio device at the same time. You can have more than one call, but one of them must be put on hold. 
     35To use APS-Direct also means that you are restricted to using the [#switchboard audio switchboard] at compile time (the audio switchboard will be explained later). This means that you lose the capability of mixing audio with PJSUA-LIB, and there are several other restrictions on your audio routing arrangements. Among other things, you can't have two calls active and connected to the audio device at the same time. You can have more than one call, but only one can be active (we recommend putting the other call on hold in this case, as illustrated in the sketch below). 
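For an application that manages its calls through PJSUA-LIB, a minimal sketch of swapping the active call could look like the following. This is not from the original page; the function name is illustrative, error handling is omitted, and the {{{pjsua_call_reinvite()}}} signature follows the 1.x API of that period:

{{{
#include <pjsua-lib/pjsua.h>

/* Illustrative sketch: with the audio switchboard only one call can be
 * connected to the sound device, so hold the current call before
 * resuming another one.
 */
static void swap_active_call(pjsua_call_id active_call,
                             pjsua_call_id other_call)
{
    /* Put the currently active call on hold. */
    pjsua_call_set_hold(active_call, NULL);

    /* Send re-INVITE to resume (un-hold) the other call. */
    pjsua_call_reinvite(other_call, PJ_TRUE, NULL);
}
}}}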
    3636 
    37 The sound device can only handle one format at a time. For example, if it is currently opened with G.729 format, you can't reconnect it to different media ports (such as stream or other pjmedia_port). You must first close it, then re-open it using the correct format. 
     37The sound device can only handle one format at a time. For example, if it is currently opened with the G.729 format, you can't reconnect it to different media ports (such as a stream or other pjmedia_port). You must first close it, then re-open it using the correct format. Note that if you are using PJSUA-LIB, this will be handled automatically (i.e. PJSUA-LIB will close/reopen the sound device with the correct format depending on what is connected to port zero in the [#switchboard audio switchboard]). But you are still limited to connecting only one media port to the sound device at a time (e.g. you cannot listen to the call and play a ringback tone through the sound device simultaneously). 
    3838 
    3939 
     
    4444 - similarly, when a passthrough codec is selected in the stream, the stream will emit and take encoded audio frames (rather than PCM frames), hence it needs APS-Direct on the other side. 
    4545 
    46 One important thing to note: '''you may still use software codecs such as Speex and G.722 even when your application is compiled with APS-Direct support'''. When one of these software codecs is selected to be used by the stream, the stream will work as usual (i.e. emitting and taking PCM audio frames), so the audio device '''must''' be opened in normal/PCM mode (i.e. non-APS-Direct mode). 
     46One important thing to note: '''you may still use software codecs such as Speex and G.722 even when your application is compiled with APS-Direct support'''. When one of these software codecs is selected to be used by the stream, the stream will work as usual (i.e. emitting and taking PCM audio frames), so the audio device '''must''' be opened in normal/PCM mode (i.e. non-APS-Direct mode). If you are using PJSUA-LIB, then again this will be handled automatically. 
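For example, to keep Speex available as a software codec in an APS-Direct build, the relevant config_site.h fragment could look like this (an illustrative sketch only; the complete list of related macros is shown in the build steps below):

{{{
/* Illustrative config_site.h fragment: passthrough codecs enabled,
 * while Speex is kept as a software (PCM) codec. */
#define PJMEDIA_HAS_PASSTHROUGH_CODECS  1
#define PJMEDIA_HAS_SPEEX_CODEC         1
}}}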
     47 
     48 
     49 
     50[[BR]] 
     51 
     52== Using APS-Direct or VAS-Direct == 
     53 
     54Currently only APS-Direct is implemented, and here are the steps to build the application with the APS-Direct feature. 
     55 
     56 1. Enable the appropriate sound device implementation. Currently only the [http://trac.pjsip.org/repos/wiki/APS APS] and native WMME sound device backends support the APS-Direct feature. 
     57 1. Enable the audio switchboard, i.e. in config_site.h: 
     58 {{{ 
     59#define PJMEDIA_CONF_USE_SWITCH_BOARD   1 
     60 }}} 
     61 1. Selectively enable/disable which PCM codecs are to be supported; for example, to disable all PCM codecs: 
     62 {{{ 
     63#define PJMEDIA_HAS_G711_CODEC          0 
     64#define PJMEDIA_HAS_L16_CODEC           0 
     65#define PJMEDIA_HAS_GSM_CODEC           0 
     66#define PJMEDIA_HAS_SPEEX_CODEC         0 
     67#define PJMEDIA_HAS_ILBC_CODEC          0 
     68#define PJMEDIA_HAS_G722_CODEC          0 
     69#define PJMEDIA_HAS_INTEL_IPP           0 
     70 }}} 
     71 1. Enable passthrough codecs, and selectively enable/disable which passthrough codecs are to be supported. The passthrough codecs available depend on which codecs are supported by the sound device backend that you choose to use: 
     72 {{{ 
     73#define PJMEDIA_HAS_PASSTHROUGH_CODECS  1 
     74 
     75// Disable all passthrough codecs except PCMA and PCMU 
     76#define PJMEDIA_HAS_PASSTHROUGH_CODEC_PCMU      1 
     77#define PJMEDIA_HAS_PASSTHROUGH_CODEC_PCMA      1 
     78#define PJMEDIA_HAS_PASSTHROUGH_CODEC_AMR       0 
     79#define PJMEDIA_HAS_PASSTHROUGH_CODEC_G729      0 
     80#define PJMEDIA_HAS_PASSTHROUGH_CODEC_ILBC      0 
     81 }}} 
     82 1. If you are using PJSUA-LIB, that is basically all that is needed to make use of APS-Direct. Please note that the application logic must take care that there can only be one source transmitting to any destination in the switchboard (see the sketch below). 
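As a rough illustration of that rule (not from the original page; the helper function and the {{{g_cur_src}}} variable are hypothetical, and error handling is omitted), an application could disconnect the previous source before connecting a new call to the sound device, which is conference port 0:

{{{
#include <pjsua-lib/pjsua.h>

/* Illustrative sketch: ensure only one source transmits to the sound
 * device (conference port 0) at any time.
 */
static pjsua_conf_port_id g_cur_src = PJSUA_INVALID_ID;

static void route_call_to_snd_dev(pjsua_call_id call_id)
{
    pjsua_conf_port_id call_port = pjsua_call_get_conf_port(call_id);

    /* Disconnect whatever is currently feeding the sound device. */
    if (g_cur_src != PJSUA_INVALID_ID)
        pjsua_conf_disconnect(g_cur_src, 0);

    /* Connect the call to the sound device, and vice versa. */
    pjsua_conf_connect(call_port, 0);
    pjsua_conf_connect(0, call_port);
    g_cur_src = call_port;
}
}}}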
    4783 
    4884 
     
    5187== Changes == 
    5288 
    53 The use of APS-Direct and VAS-Direct is very different from traditional PJMEDIA media processing, the main difference being that the audio frames returned by/given to the sound device are now in encoded format rather than in raw PCM format. The following changes will be done in order to support this. 
     89The use of APS-Direct and VAS-Direct is very different from traditional PJMEDIA media processing, the main difference being that the audio frames returned by/given to the sound device are now in encoded format rather than in raw PCM format. The following changes have been done in order to support this. 
    5490 
    5591 
     
    6096 
    6197 {{{ 
    62  typedef enum pjmedia_format {  
     98 typedef enum pjmedia_format_id {  
    6399   PJMEDIA_FORMAT_PCM, 
    64100   PJMEDIA_FORMAT_ALAW, 
     
    67103   PJMEDIA_FORMAT_AMR_NB, 
    68104   .. 
     105 } pjmedia_format_id; 
     106 
     107 /** Media format information. */ 
     108 typedef struct pjmedia_format 
     109 { 
     110    pjmedia_format_id   id; 
     111    pj_uint32_t         bitrate; 
     112    pj_bool_t           vad; 
    69113 } pjmedia_format; 
    70114 }}} 
    71115 
    72  
    73 Support for a new frame type (in {{{enum pjmedia_frame_type}}}): '''{{{PJMEDIA_FRAME_TYPE_EXTENDED}}}'''. When the frame's type is set to this type, the {{{pjmedia_frame}}} structure may be typecast to the '''{{{pjmedia_frame_ext}}}''' struct (new): 
     116We also need to support passing around non-PCM frames in PJMEDIA. We added support for a new frame type (in {{{enum pjmedia_frame_type}}}): '''{{{PJMEDIA_FRAME_TYPE_EXTENDED}}}'''. When the frame's type is set to this type, the {{{pjmedia_frame}}} structure may be typecast to the '''{{{pjmedia_frame_ext}}}''' struct (new): 
    74117 
    75118 {{{ 
     
    90133 }}} 
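For reference, the extended frame carries one or more encoded subframes appended after the structure. A rough sketch of the declarations follows; this is reconstructed for illustration only, and the authoritative definitions are in the PJMEDIA frame header:

{{{
/* Sketch of the extended frame structures; field names and layout are
 * approximations and may differ from the actual header.
 */
typedef struct pjmedia_frame_ext {
    pjmedia_frame   base;          /* Base frame; its type is set to
                                      PJMEDIA_FRAME_TYPE_EXTENDED.         */
    pj_uint16_t     samples_cnt;   /* Number of audio samples contained.   */
    pj_uint16_t     subframe_cnt;  /* Number of encoded subframes that
                                      follow this structure in memory.     */
} pjmedia_frame_ext;

typedef struct pjmedia_frame_ext_subframe {
    pj_uint16_t     bitlen;        /* Length of the encoded data, in bits. */
    pj_uint8_t      data[1];       /* Start of the encoded data.           */
} pjmedia_frame_ext_subframe;
}}}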
    91134 
     135The format information has also been added to the '''pjmedia_port_info''' structure. 
    92136 
    93 The stream must also support non-PCM audio frames in its {{{get_frame()}}} and {{{put_frame()}}} port interface.  
     137The stream must also support non-PCM audio frames in its {{{get_frame()}}} and {{{put_frame()}}} port interface. The stream will set the correct format in its '''pjmedia_port_info''' structure depending on the codec being used (i.e. if a passthrough codec is being used, the format will contain non-PCM format information). 
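For an application that opens the sound device itself (i.e. not through PJSUA-LIB), this format information can be used to decide how to (re)open the device. A rough sketch, assuming the {{{pjmedia_port_info}}} format field described above; the {{{strm}}} argument and the surrounding logic are illustrative only:

{{{
#include <pjmedia.h>

/* Illustrative sketch: pick the sound device mode based on the format
 * reported by the stream's media port.
 */
static void open_snd_for_stream(pjmedia_stream *strm)
{
    pjmedia_port *stream_port;

    pjmedia_stream_get_port(strm, &stream_port);

    if (stream_port->info.format.id == PJMEDIA_FORMAT_PCM) {
        /* PCM codec selected: open the sound device in normal/PCM mode. */
    } else {
        /* Passthrough codec selected: close the sound device if it is
         * open in a different mode, then reopen it in codec mode using
         * stream_port->info.format.
         */
    }
}
}}}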
    94138 
    95139=== Passthrough Codecs === 
    96140 
    97 While the actual codec encoding/decoding will be done by APS/VAS, "dummy" codec instances still need to be created in PJMEDIA: 
     141While the actual codec encoding/decoding will be done by the sound device, "dummy" codec instances still need to be created in PJMEDIA: 
    98142 - PJMEDIA needs the list of supported codecs so that they can be offered/negotiated in SDP, 
    99143 - some codecs have special framing requirements which are not handled by the hardware codecs, for example the framing rules of AMR codecs ([http://tools.ietf.org/html/rfc3267 RFC 3267]). 
     
    110154 
    111155 
    112 === New Audio Switchboard (the non-mixing conference bridge) === 
     156=== New Audio Switchboard (the non-mixing conference bridge) === #switchboard 
    113157 
    114 Since audio frames are forwarded back and forth in encoded format, obviously the traditional conference bridge would not be able to handle them. A new object will be added; we call this the audio switchboard ({{{conf_switch.c}}}), and its API will be compatible with the existing conference bridge API, so that it can replace the bridge in the application via a compile-time switch. 
     158Since audio frames are forwarded back and forth in encoded format, the traditional conference bridge would not be able to handle them. A new object will be added; we call this the audio switchboard ({{{conf_switch.c}}}), and its API will be compatible with the existing conference bridge API to maintain compatibility with existing applications/PJSUA-LIB. 
    115159 
    116160Understandably some conference bridge features will not be available: 
     
    125169 - much more lightweight (footprint and performance), 
    126170 - supports routing audio from ports with different ''ptime'' settings. 
    127  
    128  
    129  
    130  
    131  
    132 [[BR]] 
    133  
    134 == Using APS-Direct or VAS-Direct == 
    135  
    136 Currently only APS-Direct is implemented, and here are the steps to build the application with the APS-Direct feature. 
    137  
    138  1. Enable the APS sound device implementation as described [http://trac.pjsip.org/repos/wiki/APS here]. 
    139  1. Enable the audio switchboard, i.e. in config_site.h: 
    140  {{{ 
    141 #define PJMEDIA_CONF_USE_SWITCH_BOARD   1 
    142  }}} 
    143  1. Enable passthrough codecs, i.e. in config_site.h: 
    144  {{{ 
    145 #define PJMEDIA_HAS_PASSTHROUGH_CODECS  1 
    146  }}} 
    147  
    148 For building the sample application {{{symbian_ua}}}, those steps are enough since it is already prepared to use APS-Direct.  
    149  
    150 For a general application, there are a few more things to be handled: 
    151  - Reopening the sound device when it needs to change the active format/codec, e.g. when a call is confirmed and the stream has been created, the sound device format should match the SDP negotiation result. Here is sample code for an application using PJSUA-LIB; the sound device is reopened in the {{{on_stream_created()}}} pjsua callback, replacing the sound device instance pre-created by PJSUA-LIB: 
    152  {{{ 
    153 /* Global sound port. */ 
    154 static pjmedia_snd_port *g_snd_port; 
    155  
    156 /* Reopen sound device on on_stream_created() pjsua callback. */ 
    157 static void on_stream_created(pjsua_call_id call_id, 
    158                               pjmedia_session *sess, 
    159                               unsigned stream_idx, 
    160                               pjmedia_port **p_port) 
    161 { 
    162     pjmedia_port *conf; 
    163     pjmedia_session_info sess_info; 
    164     pjmedia_stream_info *strm_info; 
    165     pjmedia_snd_setting setting; 
    166     unsigned samples_per_frame; 
    167  
    168     /* Get active format for this stream, based on SDP negotiation result. */     
    169     pjmedia_session_get_info(sess, &sess_info); 
    170     strm_info = &sess_info.stream_info[stream_idx]; 
    171  
    172     /* Init sound device setting based on stream info. */ 
    173     pj_bzero(&setting, sizeof(setting)); 
    174     setting.format = strm_info->param->info.format; 
    175     setting.bitrate = strm_info->param->info.avg_bps; 
    176     setting.cng = strm_info->param->setting.cng; 
    177     setting.vad = strm_info->param->setting.vad; 
    178     setting.plc = strm_info->param->setting.plc; 
    179  
    180     /* Close sound device and get the conference port. */ 
    181     conf = pjsua_set_no_snd_dev(); 
    182      
    183     samples_per_frame = strm_info->param->info.clock_rate * 
    184                         strm_info->param->info.frm_ptime * 
    185                         strm_info->param->info.channel_cnt / 
    186                         1000; 
    187  
    188     /* Reset conference port attributes. */ 
    189     conf->info.samples_per_frame = samples_per_frame; 
    190     conf->info.clock_rate = strm_info->param->info.clock_rate; 
    191     conf->info.channel_count = strm_info->param->info.channel_cnt; 
    192     conf->info.bits_per_sample = 16; 
    193  
    194     /* Reopen sound device. */ 
    195     pjmedia_snd_port_create2(app_pool,  
    196                              PJMEDIA_DIR_CAPTURE_PLAYBACK, 
    197                              0, 
    198                              0, 
    199                              strm_info->param->info.clock_rate, 
    200                              strm_info->param->info.channel_cnt, 
    201                              samples_per_frame, 
    202                              16, 
    203                              &setting, 
    204                              &g_snd_port); 
    205  
    206     /* Connect sound to conference port. */ 
    207     pjmedia_snd_port_connect(g_snd_port, conf); 
    208 } 
    209  }}} 
    210  - Note that the sound device instance is now owned and managed by the application, so {{{pjsua_media_config.snd_auto_close_time}}} will not work. Here is sample code that uses the {{{on_stream_destroyed()}}} pjsua callback as the trigger for closing the sound device: 
    215  {{{ 
    216 /* Close sound device on on_stream_destroyed() pjsua callback. */ 
    217 static void on_stream_destroyed(pjsua_call_id call_id, 
    218                                 pjmedia_session *sess, 
    219                                 unsigned stream_idx) 
    220 { 
    221     if (g_snd_port) { 
    222         pjmedia_snd_port_destroy(g_snd_port); 
    223         g_snd_port = NULL; 
    224     } 
    225 } 
    226  
    227  }}} 
    228171 
    229172