Custom Query (2195 matches)

Results (1 - 3 of 2195)

Ticket Resolution Summary Owner Reporter
#2176 fixed Create stress test for timer heap nanang
Description

There have been some reports about crashes in the timer heap (there was #2172, but it was reported that the crash still occurs even after integrating the patch). Creating a stress test for the timer heap to reproduce the issue could be useful in identifying the problem and verifying a future solution.

A few notes after running the stress test:

  • The stress test could reproduce the previously suspected issue of "pop_freelist()+push_freelist() after remove_node() in pj_timer_heap_poll()"; here is a bit more description of it:

pop_freelist() only preserves the timer_id, not the timer entry, as the timer entry node has been completely detached from the heap. This may cause a crash because the number of free timer_ids can become smaller than the number of timer heap slots; e.g. if the app has 4 timer polling threads, there can be 4 fewer free timer_ids than timer heap slots, while the timer heap will only grow when there are 2 free timer heap slots left.

So r5934 removes the pop_freelist() + push_freelist() pair after remove_node().

  • The issue described in #2172 (double destroy after an unchecked cancel_timer() return code) could not be reproduced. After further investigation, it turned out to likely be a false alarm: poll() sets the timer entry's group lock to NULL (within the same timer lock block as remove_node()) before invoking the callback, so it should be safe from the double-destroy issue.
#2181 fixed Video conference implementation nanang
Description

This ticket will implement video conferencing using a centralized approach very similar to the existing audio conference: there will be an endpoint acting as the conference manager that is capable of establishing multiple video calls, mixing video from the participants (managed via connecting/disconnecting ports), and then sending the mixed result to each participant. The video mixing is done by the video conference bridge, which uses APIs very similar to those of the audio conference bridge.

Video conference bridge specifications

  • Video timing clock: just like audio, the video conference bridge has a clock that schedules video flow between ports.
  • All registered ports must be in passive mode (at least for now), so the conference bridge will actively pull video frames from source ports and push video frames to sink ports.
  • Ports can be bidirectional (they are PJMEDIA ports anyway), and just like in audio, a port connection is unidirectional, from a source port to a sink port.
  • A source port may send to multiple sink ports (splitting), and a sink port may receive from multiple source ports (mixing). For splitting, there is no limit on the number of sink ports; mixing currently supports up to four video sources (actually there is no hard limit, but only the first four will be rendered).
  • Video mixing layout. For now it will be static and not customizable, e.g:
    • two source ports will be rendered into left/right layout for a landscape sink port, or top/bottom layout for a portrait sink
    • three source ports will be rendered into left+(top/bottom) layout for a landscape sink, or top+(left/right) layout for a portrait sink
    • four source ports will be rendered into 2x2 layout (equally divided slices) for a landscape sink, or four vertically stacked slices for a portrait sink.

Implementation

Basically, the video conference implementation shares the same concept as the audio conference, so for applications that already have an audio conference feature, integrating the video conference feature should be relatively simple.

For a participant, no modification is needed; it will see the video conference as an ordinary P2P video call, and the video from the other participants will be rendered into the same video window as the remote side of the video call (the conference manager).

For the conference manager, it can set up the video conference in much the same way as an audio conference (with some additional effort for managing the UI, perhaps). Sample code for this is shown below.

Conference ports in a video window in PJSUA

A video window may be related to two or more conference ports; the application should be aware of this so it can manage the conference properly.

A video window represents a video renderer device and is automatically registered to the video conference bridge. The conference slot ID can be queried via pjsua_vid_win_get_info(); the slot ID is in pjsua_vid_win_info.slot_id. This is a sink port.

As a sink port, it normally has a source, for example a capture device or a call video stream. The conference slot ID of the source port should be queried separately, for example:

  • For a capture device, use pjsua_vid_preview_get_vid_conf_port(); the corresponding video window can be queried using pjsua_vid_preview_get_win().
  • For a call video stream (the first video stream), use pjsua_call_get_vid_conf_port(); the corresponding video window can be queried using pjsua_call_get_vid_win(). Alternatively, that info can also be queried via the call info, i.e: pjsua_call_info.media[i].stream.vid.enc_slot/dec_slot, and the corresponding video window is pjsua_call_info.media[i].stream.vid.win_in.

Sample usage: three-party video conference

Here are the steps to set up a three-party video conference:

  1. Make two video calls as normal to two other participants.
  2. After all calls are established, simply connect those port IDs using pjsua_vid_conf_connect() for PJSUA or VideoMedia::startTransmit() for PJSUA2, e.g:
    • using PJSUA:
      /* Get the video ports of call 1. There are two ports per video call,
       * as the encoding and decoding directions may not use the same frame
       * rate or resolution.
       */
      pjsua_conf_port_id call_1_dec_port, call_1_enc_port;
      pjsua_conf_port_id call_2_dec_port, call_2_enc_port;
      pj_status_t status;

      call_1_dec_port = pjsua_call_get_vid_conf_port(call_1_id, PJMEDIA_DIR_DECODING);
      call_1_enc_port = pjsua_call_get_vid_conf_port(call_1_id, PJMEDIA_DIR_ENCODING);
      
      /* Get the video ports of call 2 */
      call_2_dec_port = pjsua_call_get_vid_conf_port(call_2_id, PJMEDIA_DIR_DECODING);
      call_2_enc_port = pjsua_call_get_vid_conf_port(call_2_id, PJMEDIA_DIR_ENCODING);
      
      /* Connect the video ports of call 1 and call 2.
       * Note that the source is the stream port in the decoding direction,
       * and the sink is the stream port in the encoding direction.
       */
      status = pjsua_vid_conf_connect(call_1_dec_port, call_2_enc_port, NULL);
      if (status != PJ_SUCCESS) { /* handle error */ }
      status = pjsua_vid_conf_connect(call_2_dec_port, call_1_enc_port, NULL);
      if (status != PJ_SUCCESS) { /* handle error */ }
      
    • using PJSUA2 (exception handling excluded):
      /* Get video ports of call 1 */
      VideoMedia call_1_dec_port = call1->getDecodingVideoMedia(-1);
      VideoMedia call_1_enc_port = call1->getEncodingVideoMedia(-1);
      
      /* Get video ports of call 2 */
      VideoMedia call_2_dec_port = call2->getDecodingVideoMedia(-1);
      VideoMedia call_2_enc_port = call2->getEncodingVideoMedia(-1);
      
      /* Connect video ports of call 1 and call 2 */
      VideoMediaTransmitParam transmit_param;
      call_1_dec_port.startTransmit(call_2_enc_port, transmit_param);
      call_2_dec_port.startTransmit(call_1_enc_port, transmit_param);
      
  3. At this point, each participant should be able to see video from the two other participants. The caller will see them in separate windows, since it has two video calls (these may be combined if preferred, see below), while the other two participants will each have a single incoming video window as usual, except that it contains mixed video (from the caller and the other participant).
  4. On the caller side, it may combine the video windows of call 1 and call 2 into a single window (perhaps for simpler UI management), e.g:
    • using PJSUA:
      pjsua_vid_win_id wid1, wid2;
      pjsua_vid_win_info win2_info;
      
      /* Put incoming video stream from call 1 into call 2 window */
      wid2 = pjsua_call_get_vid_win(call_2_id);
      pjsua_vid_win_get_info(wid2, &win2_info);
      pjsua_vid_conf_connect(call_1_dec_port, win2_info.slot_id, NULL);
      
      /* Now hide the video window of call 1 */
      wid1 = pjsua_call_get_vid_win(call_1_id);
      pjsua_vid_win_set_show(wid1, PJ_FALSE);
      
    • using PJSUA2:
      /* Helper to get the first video window of the specified call */
      VideoWindow getCallVideoWindow(const Call *call) {
          CallInfo ci = call->getInfo();
          CallMediaInfoVector::iterator it;
          for (it = ci.media.begin(); it != ci.media.end(); ++it) {
              if (it->type == PJMEDIA_TYPE_VIDEO &&
                  it->videoIncomingWindowId != PJSUA_INVALID_ID)
              {
                  return it->videoWindow;
              }
          }
          return VideoWindow(PJSUA_INVALID_ID);
      }
      
      /* Put incoming video stream from call 1 into call 2 window */
      VideoWindow wid2 = getCallVideoWindow(call2);
      VideoMediaTransmitParam transmit_param;
      call_1_dec_port.startTransmit(wid2.getVideoMedia(), transmit_param);
      
      /* Now hide the video window of call 1 */
      VideoWindow wid1 = getCallVideoWindow(call1);
      wid1.Show(false);
      
#2228 fixed Bug in PCM shift in G722 nanang
Description

There are some issues:

  • Clipping detector (to stop shifting). The clip detector in the G722 decoder kicked in immediately when decoding started, seemingly because negative signal levels were being fed into it. The current implementation of the G722 decoder returns values in the range [-16384, 16383]. The way the current clip detector is constructed, it will always detect clipping on negative values because of the bit representation of negative values, i.e: the MSB is always 1 in two's complement.
  • Simple shifting in encoding and decoding. Due to the bit representation of negative values, a simple right/left bit shift may produce unexpected values, e.g: left-shifting a positive value may yield a negative value, and right-shifting a negative value will result in a positive value.

This ticket will fix the bugs in the clipping detector and in encoding/decoding. It also updates the clipping detector so that the shifting is reduced step by step after a potential clipping is detected, instead of being stopped immediately as in the current implementation.

Thanks to Martin Navne for the report and the patch.
