using MXP at live events
This document was generated from CDN thread

Created by: Marko Laurits on 18-01-2010 01:04:20 PM
Hello!

Last week I ran a videoconference at a customer's event. I have done videoconferencing at dozens of events, and I would now like to write down all the questions and comments that have come to mind while doing that kind of work.

What I mean by the word "event": it is usually a public conference with hundreds of visitors, or a show where several towns or countries are linked together. One example is the annual Tandberg Partner event, where several different connections were made in a hall with hundreds of participants. Expectations of the technology are usually higher at events than in typical meeting rooms.

Our most recent event was a customer's party that took place in 3 locations. A 3000 MXP running F8.2 was used as the bridge. The speed of both connections was 1152 kb/s.

1) The biggest problem at such events is packet loss, because the connection is usually made over the Internet. I don't understand why Tandberg has not developed any measures to cope with packet loss. RTP and IPLR are not very helpful: you see pixelation already at 0.2% lost packets. Mathematical algorithms for recovering lost information through redundancy have existed in digital communications for years. Polycom has had LPR for a couple of years now, and Skype has used redundancy since the beginning, AFAIK. Tandberg wants to be a videoconferencing pioneer but offers nothing here so far.

2) On one hand, the 3000 MXP feels very powerful. It can operate in full-screen switching mode for the public screens while showing a split-screen multi-site image for event organisers on a second screen, and a similar image with OSD menus for technicians on a third screen. That's very good!
At the same time, the 3000 MXP feels quite weak when handling multi-site video.

The first thing you notice is that as soon as you add a third participant, the resolution drops from 448p to CIF. I understand that to perform transcoding you need to encode all streams independently, and therefore the processing power must be divided between all participants. Also, if one participant encounters packet loss and requests an I-frame, you don't want to affect the other streams; that's the second reason for generating separate streams.

Videoconferencing endpoints from previous generations did not perform any transcoding or rate matching but sent exactly the same stream to every participant. When all participants are Tandberg MXPs running at the same speed, IMHO "the old way" would sometimes be preferable: the bridge could create one stream for all participants, and that single stream could have better resolution. Moreover, when a full-screen layout is used, it theoretically isn't necessary to re-encode the image at all; it could simply be forwarded.

3) The second thing that slightly annoys customers is the multi-site layout in 3-way conferences, i.e. the layout where all three participants are equally important. The Tandberg Classic showed one window on top and two windows below, which was more pleasant than the current option where one corner is black. Why is the one-above-two-below layout not supported on the MXP? Would it require more processing power, or is it just a question of programming?

On the 3000 MXP you can use the PictureProgram option to show different custom layouts locally. That's very nice, but it raises another question: why can't these be used as multi-site layouts shown to all participants?

4) As this blank quarter was a little distracting, we tried to put a bitmap image there instead. We set up a PC with PVX, which has an option to send a bitmap when no camera is connected. That brought up new questions.

First, we tried to use as little bandwidth for the bitmap as possible. The total bandwidth of the 3000 MXP is 2.3 Mb/s, so we needed to keep the video channels to the two other locations as fast as possible. You can dial a call at 1152 kb/s or 768 kb/s, but how do you initiate a call at an intermediate speed, e.g. 1024 kb/s?

5) As a workaround, we initiated two calls at 1152 kb/s and then a third call at low speed; the MXP automatically lowers the speed as much as necessary to admit the fourth participant. First we tried a 128 kb/s connection to the PC. As a result, the resolution of the continuous-presence image dropped from PAL to CIF. My conclusion is that although the MXP does transcoding and rate matching, all participants have to receive the image at the same resolution. If one participant has a low-speed connection and can only receive CIF, then all participants receive CIF continuous presence. Is that correct?

6) Then we tried a 256 kb/s connection to the PC. As a result, the speed of one connection was reduced from 1152 to ~896 kb/s, which was expected. However, the stream sent to this participant was 256 kb/s; the statout command on the bridge showed this value. The endpoint with the slower connection showed an incoming connection speed of 8xx kb/s, but the video rate was ~200 kb/s.

Why is that? Does it mean that the MXP is actually unable to create three independent video streams with different speeds, and only two video streams are generated?

7) Because of the above, we could not use the fourth participant (768 kb/s per location would have been too slow), so we decided to leave one quarter black. Sometimes we showed the 4-way split continuous presence; sometimes one location was shown full screen (using the Floor command). Once, in full-screen mode, I saw another surprise.

There was a dance show at one location that was presented on the screens at the other sites. Obviously, a fast dance is not an ideal thing to transmit over videoconference, but it was done at this event (and has been done before). The MXP operating as the bridge was in the same town where the dance show took place, and I personally watched the dance over video at another location. The image was quite "slow", as you can imagine. To my surprise, the video stream from the dance floor was only 400-500 kb/s, while the connection speed was 1152 kb/s. The video stream in the opposite direction was ~1000 kb/s, as expected.
Why did the MXP use only ~400 kb/s to transmit a fast-moving, detailed image when it could have used ~1000 kb/s and achieved better quality? Does this mean the encoding algorithm is not well optimised, or that when the image is detailed and fast, the processor lacks the power to encode it optimally?

I switched between continuous presence and voice-switched layout, but it had no effect. (The layout was changed only at the location that was "on air", since the other locations saw the full-screen image due to the Floor command anyway.)

8) As mentioned above, we used the Floor command to choose which participant is seen at every location. Up to version F5 you could use the 'mcucommand floor...' command, or actually its abbreviated form 'mcuc fl...'. Unfortunately, in newer versions this command no longer works; you need to use 'xCommand FloorToSite MCUID: 1 Site: ...'. This is not a problem if you use Crestron / AMX, but at live events one usually uses SSH / Telnet to change settings as fast as possible. Of course, you can copy 'xcom floo mcuid: 1 site: ' to the clipboard to make the changes faster, but it would be nice if short commands were still available, because at live show events every second counts.
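One practical workaround is to pre-build the long command strings and send them programmatically over Telnet, so that during the show a single function call (or a hotkey bound to it) switches the floor. The sketch below is hypothetical: the host address is a placeholder, and only the 'xCommand FloorToSite' syntax comes from this thread.

```python
import socket

def build_floor_command(site: int, mcu_id: int = 1) -> str:
    # Assemble the verbose XACLI command from the thread's example.
    return f"xCommand FloorToSite MCUID: {mcu_id} Site: {site}"

def send_xacli(host: str, command: str, port: int = 23, timeout: float = 3.0) -> None:
    # Fire-and-forget: open the Telnet port, send one command line, close.
    # A production script would also log in and read the response.
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall((command + "\r\n").encode("ascii"))

# Example usage (bridge address is made up):
# send_xacli("10.0.0.42", build_floor_command(site=2))
print(build_floor_command(site=2))  # prints the full command string
```

This keeps the typing out of the live path: the operator picks a site number, and the verbose command is assembled and sent in one step.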

9) Could you please explain the AAC-LD-128-Mono option a little? If it is enabled, mono audio is transmitted at 128 kb/s, right?
Do 128 and 64 kb/s AAC-LD compare to each other like 128 and 64 kb/s MP3 do, i.e. 128 kb/s should give much better quality than 64 kb/s, especially when transmitting music?

Is 128 kb/s stereo essentially 2 x 64 kb/s AAC-LD, i.e. no joint stereo coding is used at all?

10) Speaking of stereo: do I understand correctly that only line-in audio can be transmitted as left and right channels?
Is the microphone sound mixed into both channels equally?
Is there any possibility of sending microphone 1 as the left and microphone 2 as the right audio channel (or vice versa)? If not, is it a hardware limitation?

11) Tandberg F7 or newer can show the participant name in the continuous-presence window, but in the full-screen layout no name is displayed. The customer asked us to display the name in full-screen view as well. Hopefully this will be available in future versions.

12) As a technician at a live event, you are mostly interested in two information screens: packet loss and the audio input indicators. The problem is that switching between these two screens is very time-consuming, especially opening the audio input level indicators. These indicators are quite important because it is easy to exceed the clipping level. I see two possible ways to make them more accessible.

The first would be to include the input level indicators on the status screen. You can use the OK+up key combination as a shortcut to the status screen and then the left/right keys to navigate between the bit-rate and packet-loss tabs; that's very handy. It would be helpful if audio input (and output) levels were shown on a tab there as well.

The second would be to open the audio settings screen with a single key press. Right now you can configure 10 camera presets to be accessed by pressing a number key. It would be helpful if these shortcut keys could be used not only for presets but also for opening specific settings pages. For example, some users often need to switch between the PC image quality / sharpness settings; it would be nice if they could open that page with one click.

By the way, are these audio input levels measured before or after AGC?

Thank you very much in advance if you can answer all these questions.

Marko

Subject: RE: using MXP at live events
Replied by: Roger Boe on 19-01-2010 02:33:14 PM
Hi Marko,

This forum is mainly for developers and integrators, and your questions are mostly support related. In future, I would urge you to contact your local TANDBERG representative with such questions. However, I will try to answer all your questions below.

Marko Laurits:


1) The biggest problem at such events is packet loss because the connection is usually made over the Internet. [...]


As you said, we have IPLR and packet-loss downspeeding. I cannot comment on future development in this forum.

Marko Laurits:

2) [...] The first thing you notice is that as soon as you add a third participant, the resolution drops from 448p to CIF. [...]


The system has two encoders and doesn't do full individual transcoding. Two encoders mean it can cope with two different incoming bandwidths. So if you have three calls at 384, 768 and 1152 kbps and the initial call rate was 1152 kbps, the 384 and 768 kbps sites will receive the same video, as they share the second encoder; the 768 kbps site will then receive 384 kbps call quality. The system has a split-screen setting in the Call Quality / Video Quality menu. If you set it to "Sharpness", the unit will prefer sharpness over motion in multisite and will then typically prefer a w576p image (1024x576 pixels), depending on the remote sites' capabilities.
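The two-encoder behaviour described above can be sketched as a small model: the fastest site gets a dedicated encoder, and everyone else shares the second encoder at the slowest of the remaining rates. This is a toy illustration of the rule stated in the reply, not the actual firmware logic.

```python
def assign_encoders(call_rates_kbps: list[int]) -> dict[int, int]:
    """Map each site's call rate to the video rate it effectively receives.

    Rule from the reply: the highest-rate site gets its own encoder; all
    other sites share the second encoder, which must run at the lowest
    rate among them so every sharing site can receive the stream.
    """
    distinct = sorted(set(call_rates_kbps), reverse=True)
    top = distinct[0]
    shared = min(call_rates_kbps) if len(distinct) > 1 else top
    return {rate: (top if rate == top else shared) for rate in distinct}

# The scenario from the reply: three calls at 384, 768 and 1152 kbps.
print(assign_encoders([384, 768, 1152]))
# The 1152 kbps site gets full quality; 768 and 384 both get 384 kbps video.
```

This also explains the observation in point 6: with three sites at different rates, only two distinct video streams exist, so one site's connection speed and its actual video rate can diverge.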

Marko Laurits:

3) [...] Why is the one-above-two-below layout not supported on the MXP? [...]


The layout is just a matter of design. The MXP is designed this way; it has nothing to do with processing power. The PictureProgram feature is limited on the MXP; such features are available on the new C Series platform.

Marko Laurits:

4) [...] You can dial a call at 1152 kb/s or 768 kb/s. But how to initiate a call with an intermediate speed, e.g. 1024 kb/s?


The MXP bandwidth selection was designed around the concept of ISDN BRI channels. 768kbps is equal to 12 ISDN B-channels. The next legal step up using ISDN bonding was 18 ISDN B-channels = 1152kbps. Because of this the MXP doesn't have any intermediate speed.
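The rate steps above follow directly from the B-channel arithmetic: each ISDN B-channel carries 64 kb/s, so every dialable rate is a multiple of 64. A quick check of the numbers from the reply and the question:

```python
# Each ISDN B-channel carries 64 kb/s; bonded call rates are multiples of it.
B_CHANNEL_KBPS = 64

def bonded_rate(channels: int) -> int:
    # Aggregate rate of an ISDN bonded call using `channels` B-channels.
    return channels * B_CHANNEL_KBPS

assert bonded_rate(12) == 768    # 12 B-channels, as stated in the reply
assert bonded_rate(18) == 1152   # the next bonding step the MXP offers

# The 1024 kb/s rate asked about in point 4 would correspond to 16
# B-channels - a perfectly valid multiple of 64, just not a step the
# MXP's ISDN-derived menu exposes.
print(1024 // B_CHANNEL_KBPS)    # prints 16
```

So the gap between 768 and 1152 is a menu-design choice inherited from the ISDN bonding steps, not a protocol restriction on intermediate rates.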


Marko Laurits:

5) [...] If one participant has a low-speed connection and can only receive CIF, then all participants will receive CIF continuous presence. Is that correct?


See comment for point 2 (ref two encoders).

Marko Laurits:

6) [...] Does it mean that the MXP is actually unable to create three independent video streams with different speeds?


Correct, ref point 2.

Marko Laurits:

7) [...] Why did the MXP use only ~400 kb/s to transmit a fast-moving, detailed image if it could use ~1000 kb/s and achieve better quality?


Since I don't know the whole setup and don't have any status information for the call, I can't say for certain. The MXP does have optimised encoding, but this sounds to me like the limitation described under point 2 above.

Marko Laurits:

8) [...] Unfortunately, in newer versions the abbreviated 'mcuc fl...' command no longer works; you need to use 'xCommand FloorToSite MCUID: 1 Site: ...'. [...]


You are referring to the old Classic API. This API was, and still is, available to ease the transition for integrators from the old TANDBERG Classic platform to the MXP. However, we have stressed that the new XACLI (xCommand Advanced Command Line Interface) should be used. Now that we have reached F8, we expect integrators to use the new API provided by TANDBERG. It is optimized for machine-to-machine communication and is also more flexible than the older API.

Marko Laurits:

9) Please could you explain the AAC-LD-128-Mono option? [...] Is 128 kb/s stereo principally 2 x 64 kb/s AAC-LD, i.e. no joint stereo coding is used at all?


I myself have not been able to hear any noticeable difference between 128 kb/s mono and 64 kb/s mono. However, in theory and with a good speaker system, you may be able to hear one. With stereo you have 2 x 64 kb/s full AAC-LD channels available. However, stereo will only be output by the far-end system if it has a DNAM connected, or if the local microphones are muted. This is due to a limitation in stereo echo cancelling.

Marko Laurits:

10) Speaking about stereo - do I understand correctly that you can transmit only line-in audio as left and right channels? [...]


It is not possible to send the microphones as two separate channels. This is not a hardware limitation.

Marko Laurits:

11) Tandberg F7 or newer can show the participant name in the continuous-presence window, but at full-screen layout no name is displayed. [...]


We don't comment upon future versions in this forum.

Marko Laurits:

12) [...] Btw, are these audio input levels measured before or after AGC?


You should contact your TANDBERG representative for feature requests. We don't handle feature requests in this forum. The audio input levels are measured before AGC.

Best regards
Roger

Subject: RE: using MXP at live events
Replied by: Marko Laurits on 03-06-2010 08:45:38 PM
Hello and thanks for the answer!

Roger Boe:
This forum is mainly for developers and integrators and your questions are mostly support related.

Yes, but from here one gets the most informative answers very promptly - your message proved that. I have also entered the feature wishes on the support site.

The system has two encoders and doesn't do a full individual transcoding.

That was a bit of a surprise, because I had not seen it mentioned in any technical documentation - though I suspected it, as discussed in point 6.

Two encoders means that it can cope with two different incoming bandwidths. So if you have three calls at 384, 768 and 1152kbps and the initial call rate was 1152kbps, the 384 and 768 kbps site will receive the same video as they will share the second encoder. The 768kbps site will then receive 384kbps call quality.

I understand that the MXP needs two encoders to compress two different images in the case of a voice-switched or 5+1 layout: the person who is "on air" receives a different image than the rest of the participants.

An example:
Site D is a multi-site host, connecting to sites A, B and C
Site A is connected at 384 kb/s
Site B is connected at 768 kb/s
Site C is connected at 1152 kb/s

Voice switched layout, site B is "on air".
Site B sees site A, C or D (whichever was previously the active site) at 768 kb/s.
Sites A and C see site B talking at 384 kb/s, as they share the same encoder.
Site D sees site B at 768 kb/s, encoded only once. The other participants see video and hear sound that has been decoded and re-encoded by the bridge (and at a lower speed) and therefore experience lower quality.

Is the above correct?

The system has a split screen setting in the Call Quality/Video Quality menu. If you set this one to "Sharpness", the unit will prefer sharpness over motion in multisite. The system will then typically prefer a w576p image (1024*576 pixels) depending on the remote sites capabilities.

I have read from WR that the MXP is able to encode approximately 448p at 30 f/s. 448p resolution is ~230 k-pixels; w576p means ~590 k-pixels. If the MXP encodes two different streams at w576p, it needs to encode ~1180 k-pixels, about 5 times more than one 448p stream. Does that mean the streams are encoded at roughly 6 f/s?
Where is the mistake in this logic?
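The arithmetic in that question can be checked directly. Note the resolutions are the poster's assumptions (448p taken as 512x448, w576p as 1024x576), and the ~230 k-pixels at 30 f/s encoding budget comes from the post, not from a datasheet:

```python
# Pixel-throughput budget implied by "448p at 30 f/s".
budget_pixels_per_sec = 512 * 448 * 30      # ~6.9 Mpixels/s

w576p = 1024 * 576                          # 589,824 pixels per frame (~590 k)
two_streams = 2 * w576p                     # both encoders running at w576p

fps = budget_pixels_per_sec / two_streams
print(round(fps, 1))                        # ~5.8 f/s under these assumptions
```

So the "about 6 f/s" figure in the question does follow from its own premises; whether the premises (fixed pixel budget, both encoders at full w576p) match the real hardware is exactly what the question is asking.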

However stereo will only be output by the far end system if it has a DNAM connected, or if the microphones locally are muted. This is due to a limitation when it comes to stereo echo canceling.

That's interesting; I have not found such information in the manuals. Is it just me, or do you have better documentation than we (Tandberg partners) do?
I'd like to read the same documentation you do; then I would not waste your time with stupid questions.

But I don't yet understand how the DNAM helps. Is it somehow involved in the echo cancellation process?

It is not possible to send the microphones as two separate channels. This is not a hardware limitation.

Is it related to echo cancellation, like the point above?

Greetings and thanks again!

Subject: RE: using MXP at live events
Replied by: Wanis Elabbar on 02-07-2010 03:40:47 PM
Marko, when it comes to packet loss, IPeak Networks have developed a solution that can stop this from happening. It's called IPQ:

http://www.ipeaknetworks.com/video-conferencing

A YouTube video of IPQ in action:
http://www.youtube.com/watch?v=Kio9948kAgE