Changes between Version 7 and Version 8 of Nokia_APS_VAS_Direct


Timestamp: Feb 16, 2009 2:30:36 PM
Author: bennylp


The '''APS-Direct''' and '''VAS-Direct''' are our codenames for features that use the hardware codecs supported by Nokia APS and/or VAS directly, bypassing media processing in PJMEDIA. These features will be introduced gradually, beginning in PJSIP version 1.1.
APS stands for '''Audio Proxy Server''', and it is available as a plug-in for Nokia S60 3rd Edition up to Feature Pack 2. APS has been deprecated for FP2 devices and above, and it is being replaced by '''VoIP Audio Services (VAS)''', which is available as a plug-in for S60 FP1/FP2 devices and will be built into later S60 versions.
     
[[BR]]

= Introduction =

The Nokia APS and VAS support codecs such as G.711 (PCMA and PCMU), G.729, iLBC, and AMR-NB, although the availability of these codecs varies with the handset type. There are significant benefits to using these codecs instead of the software codecs in PJMEDIA-CODEC, the main ones being performance (hardware rather than software codecs, lower latency) and the codec licenses/royalties already being covered by the handset.
     
Due to these benefits, the ability to use these codecs in PJSIP applications is very desirable.

Note that non-APS codecs can still be used as usual, e.g. GSM, Speex/8000, and G.722.

[[BR]]

= Concept =

Before starting to work with APS-Direct, please make sure you understand the concepts behind it so that you can design your application appropriately.

The whole point of APS-Direct is to enable end-to-end '''encoded audio format media flow''', that is, from the microphone device down to the network/socket and from the network/socket to the speaker device. This may sound obvious, but it has the following serious implications, which will impact your application design.

 '''No mixing''' ::

  As will be explained later, we have developed a new variant of the conference bridge called the ''audio switchboard''. This object has the same API as the bridge, but it lacks the bridge's mixing capability. The implication is that you can't have two slots transmitting to the same slot in the switchboard.

  So you can't have two calls active and connected to the audio device at the same time. You can have more than one call, but all but one of them must be put on hold (see the sketch after this list).

 '''One format rule''' ::

  The sound device can only handle one format at a time, meaning that if it is currently opened in G.729 format (for one call, for example), you can't feed it PCM frames, for example from the tone generator or from PCM WAV files.

  All PJMEDIA features that work with PCM audio will no longer work if the audio device is currently opened in codec mode. This includes the tone generator (tonegen) and WAV files. If you wish to use any of these features, you must close the sound device and re-open it in PCM mode.
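
For illustration, here is a minimal pjsua-level sketch of living with the no-mixing rule; the helper name and the call bookkeeping are the application's own, not part of the library:

 {{{
#include <pjsua-lib/pjsua.h>

/* Sketch: with the audio switchboard there is no mixing, so keep
 * exactly one call wired to the sound device (conference slot 0)
 * and put any other call on hold first.
 */
static void switch_active_call(pjsua_call_id active, pjsua_call_id other)
{
    pjsua_call_info ci;

    /* Hold the other call so its slot stops using the sound device. */
    if (other != PJSUA_INVALID_ID)
        pjsua_call_set_hold(other, NULL);

    /* Connect the active call's slot to the sound device, both ways. */
    if (pjsua_call_get_info(active, &ci) == PJ_SUCCESS) {
        pjsua_conf_connect(ci.conf_slot, 0);
        pjsua_conf_connect(0, ci.conf_slot);
    }
}
 }}}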

[[BR]]

= Changes =

The use of APS-Direct and VAS-Direct is very different from traditional PJMEDIA media processing, the main difference being that the audio frames returned by or given to the sound device are now in encoded format rather than in raw PCM format. The following changes will be made in order to support this.

== Non-PCM Format ==

Media ports may now support non-PCM media, and this is signaled by adding a new "format" field to {{{pjmedia_port_info}}}.

 {{{
 typedef enum pjmedia_format {
   PJMEDIA_FORMAT_PCM,
   PJMEDIA_FORMAT_ALAW,
   PJMEDIA_FORMAT_ULAW,
   PJMEDIA_FORMAT_G729,
   PJMEDIA_FORMAT_AMR_NB,
   ..
 } pjmedia_format;
 }}}

Support for a new frame type (in {{{enum pjmedia_frame_type}}}): '''{{{PJMEDIA_FRAME_TYPE_EXTENDED}}}'''. When a frame's type is set to this type, the {{{pjmedia_frame}}} structure may be typecast to the new '''{{{pjmedia_frame_ext}}}''' struct, sketched below.
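
Roughly, the extended frame carries a counted list of encoded subframes appended after the header; the field names below follow the PJMEDIA headers and should be verified against your source tree:

 {{{
/* Sketch of the extended frame layout (verify against <pjmedia/frame.h>). */
typedef struct pjmedia_frame_ext {
    pjmedia_frame   base;          /* Base frame info (type, timestamp, ..) */
    pj_uint16_t     samples_cnt;   /* Number of samples in this frame       */
    pj_uint16_t     subframe_cnt;  /* Number of (sub)frames in this frame   */
    /* Zero or more subframes follow immediately after this struct,
     * each described by pjmedia_frame_ext_subframe.
     */
} pjmedia_frame_ext;

typedef struct pjmedia_frame_ext_subframe {
    pj_uint16_t     bitlen;        /* Number of bits in the data  */
    pj_uint8_t      data[1];       /* Start of the encoded data   */
} pjmedia_frame_ext_subframe;
 }}}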
     
The stream must also support non-PCM audio frames in its {{{get_frame()}}} and {{{put_frame()}}} port interface.
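
For illustration, a port's {{{put_frame()}}} handler could handle both frame types along these lines (a simplified sketch, not the actual stream code):

 {{{
#include <pjmedia.h>

/* Sketch: accept both raw PCM and encoded (extended) frames. */
static pj_status_t my_put_frame(pjmedia_port *this_port,
                                pjmedia_frame *frame)
{
    PJ_UNUSED_ARG(this_port);

    if (frame->type == PJMEDIA_FRAME_TYPE_EXTENDED) {
        /* Encoded audio: the frame can be typecast to pjmedia_frame_ext
         * and its subframes passed on without any PCM processing.
         */
        const pjmedia_frame_ext *ext = (const pjmedia_frame_ext*)frame;
        PJ_UNUSED_ARG(ext);
        return PJ_SUCCESS;
    }

    /* PJMEDIA_FRAME_TYPE_AUDIO: the usual raw PCM path. */
    return PJ_SUCCESS;
}
 }}}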

== Passthrough Codecs ==

While the actual codec encoding/decoding will be done by APS/VAS, "dummy" codec instances still need to be created in PJMEDIA:
 - PJMEDIA needs the list of supported codecs so they can be offered/negotiated in SDP,
 - some codecs have special framing requirements which are not handled by the hardware codecs, for example the framing rules of the AMR codecs ([http://tools.ietf.org/html/rfc3267 RFC 3267]).

Passthrough codecs will be implemented for: PCMA, PCMU, iLBC, G.729, and AMR-NB.
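
For illustration, registering these passthrough codecs from application code would look roughly like the sketch below; the {{{pjmedia_codec_passthrough_init()}}} name follows the pjmedia-codec passthrough wrapper and should be checked against the version you are using:

 {{{
#include <pjmedia.h>
#include <pjmedia-codec.h>

/* Sketch: register the passthrough ("dummy") codec factory so that
 * PCMA/PCMU/iLBC/G.729/AMR-NB can be offered in SDP while APS/VAS
 * performs the actual encoding/decoding.
 */
static pj_status_t register_passthrough_codecs(pjmedia_endpt *med_endpt)
{
    return pjmedia_codec_passthrough_init(med_endpt);
}
 }}}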

== (Symbian) Sound Device API ==

The APS/VAS-based sound device backends will support additional APIs:
     
 - audio routing to loudspeaker or earpiece (this API is already available)
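
As a sketch, switching the output route could look like the following, assuming the Audio Device API output-route capability is available in your build (the helper name is ours):

 {{{
#include <pjmedia.h>

/* Sketch: route output to the loudspeaker via the Audio Device API
 * capability mechanism; snd_port is the application's sound port.
 */
static pj_status_t route_to_loudspeaker(pjmedia_snd_port *snd_port)
{
    pjmedia_aud_stream *strm = pjmedia_snd_port_get_snd_stream(snd_port);
    pjmedia_aud_dev_route route = PJMEDIA_AUD_DEV_ROUTE_LOUDSPEAKER;

    return pjmedia_aud_stream_set_cap(strm,
                                      PJMEDIA_AUD_DEV_CAP_OUTPUT_ROUTE,
                                      &route);
}
 }}}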

== New Audio Switchboard (the non-mixing conference bridge) ==

Since audio frames are forwarded back and forth in encoded format, obviously the traditional conference bridge would not be able to handle them. A new object will be added; we call it the audio switchboard ({{{conf_switch.c}}}), and its API will be compatible with the existing conference bridge API, so that it can replace the bridge in the application via a compile-time switch.
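
For example, the compile-time switch would be set from {{{config_site.h}}}, roughly as below; the macro name is taken from {{{pjmedia/config.h}}} and should be verified against your version:

 {{{
/* config_site.h sketch: use the audio switchboard (conf_switch.c)
 * instead of the mixing conference bridge.
 */
#define PJMEDIA_CONF_USE_SWITCH_BOARD   1
 }}}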
     
[[BR]]

= Using APS-Direct or VAS-Direct =

Currently only APS-Direct is implemented; here are the steps to build an application with the APS-Direct feature.

 1. Enable the APS sound device implementation as described [http://trac.pjsip.org/repos/wiki/APS here].
     
 }}}

For building the sample application {{{symbian_ua}}}, those steps are enough since it is already prepared to use APS-Direct.

For a general application, there are a few more things to be handled:
     
}
 }}}
 - Note that the sound device instance is now owned and managed by the application, so {{{pjsua_media_config.snd_auto_close_time}}} will not work. Here is sample code that uses the {{{on_stream_destroyed()}}} pjsua callback as the trigger for closing the sound device:
 {{{
/* Close sound device on on_stream_destroyed() pjsua callback. */
     
 }}}

= References =

Internal documentation:
     