For general setup and usage questions about Larix apps, please check the Larix documentation reference, which lists instructional articles and videos for various platforms and servers.
Some specific questions are answered below.
Q1: How does adaptive bitrate (ABR) work in Larix Broadcaster?
Adaptive video bitrate is supported in three modes:
- Logarithmic descend – gracefully steps down from the maximum bitrate. Every minute it retries raising the bitrate back to the previous step. Best fit for good networks.
- Ladder ascend – first cuts the bitrate by 2/3, then increases it back toward normal as much as possible. It retries raising back to previous steps after 15 seconds, 1.5 minutes and then 5 minutes. Best fit for networks with heavy losses.
- Hybrid – calculates the percentage of actually delivered packets and decreases the target bitrate by that ratio. The minimum bitrate is 25% of the target. Larix tries to restore the bitrate every 30 seconds in 500 Kbps steps.
For TCP-based protocols (RTMP/RTSP), Hybrid mode also takes latency into account (the time needed to send all queued data).
For SRT it depends only on the ratio between the amount of data queued for sending and the amount actually sent.
On iOS, Hybrid supports RTMP, RTSP and SRT; on Android, Hybrid is supported only for SRT.
- Variable FPS can be used as an option: in addition to changing the bitrate value, it reduces the bitrate further by decreasing FPS.
The trigger for switching to a lower bitrate or frame rate is the number of lost packets within a certain period of time.
For Logarithmic descend it's 4 packets within the last 10 seconds.
For Ladder ascend it equals bitrate/300000 within the last 10 seconds, e.g. for 2 Mbps it's about 6 packets.
Example for Hybrid: you set the target bitrate to 6000 Kbps and the actual outgoing bitrate is 5000 Kbps. Due to network failures the real delivery bitrate drops by 50% to 2500 Kbps, so the target bitrate is reduced by half to 3000 Kbps. On the next restore attempt Larix tries to raise it back to 3500 Kbps.
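Below is a minimal Swift sketch of the Hybrid adjustment logic described above. The type name and the way statistics are gathered are hypothetical; only the ratio-based reduction, the 25% floor and the 500 Kbps / 30-second restore step come from the description.

import Foundation

// Hypothetical sketch of Hybrid ABR: scale the bitrate by the delivery ratio,
// keep a 25% floor, and raise it back in 500 Kbps steps every 30 seconds.
struct HybridAbrSketch {
    let targetBitrate: Double                 // bitrate from settings, in bps
    let minRatio = 0.25                       // floor: 25% of the target
    let restoreStep = 500_000.0               // raise back by 500 Kbps...
    let restoreInterval: TimeInterval = 30    // ...every 30 seconds
    private(set) var currentBitrate: Double

    init(targetBitrate: Double) {
        self.targetBitrate = targetBitrate
        self.currentBitrate = targetBitrate
    }

    // Feed the amount of data queued for sending vs. the amount actually delivered.
    mutating func onStats(bytesQueued: Double, bytesDelivered: Double) {
        guard bytesQueued > 0 else { return }
        let deliveredRatio = bytesDelivered / bytesQueued   // e.g. 2500 / 5000 = 0.5
        if deliveredRatio < 1.0 {
            // With the example above: 6000 Kbps * 0.5 = 3000 Kbps, never below 25% of target.
            currentBitrate = max(targetBitrate * minRatio,
                                 currentBitrate * deliveredRatio)
        }
    }

    // Called every restoreInterval seconds while below the target: 3000 -> 3500 Kbps, etc.
    mutating func onRestoreTick() {
        currentBitrate = min(targetBitrate, currentBitrate + restoreStep)
    }
}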
For RTMP and RTSP connections we count lost packets ourselves.
On iOS 11+, the packets are not actually lost but are kept in the system buffer, so if ABR is not used, the delivery delay will increase.
For SRT, packet loss is defined by the pktSndDrop property; it depends on how SRT handled the loss according to the latency setting and other internal factors.
Q2: Larix frame rate issues: no 24 FPS, no 60 FPS, wrong FPS, etc.
A typical issue: Larix FPS is set to 25, but the decoder on the receiver side shows 30 FPS or no FPS data at all.
First of all, mobile encoders do not add proper frame rate information to the SPS, which causes some decoders to get confused and fall back to a default value like “30”. There’s no way to set a definite FPS at the moment: the encoder simply does not put that information into the SPS, so Larix cannot provide it either.
Larix always uses the system encoder, so we cannot precisely control the frame rate.
On the iOS platform the produced frame rate may be variable and we cannot control it. At the moment we can only use an encoder setting which “recommends” a certain frame rate to the encoder.
This is how it’s described in Apple docs: “This is not used to control the frame rate; it is provided as a hint to the video encoder so that it can set up internal configuration before compression begins. The actual frame rate will depend on frame durations and may vary.”
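For reference, that hint is set on the VideoToolbox compression session. A minimal Swift sketch (not Larix’s actual code; the function name is hypothetical):

import Foundation
import VideoToolbox

// Pass the desired frame rate to the encoder as a hint only; the actual
// output frame rate may still vary, as the Apple docs quoted above explain.
func setFrameRateHint(on session: VTCompressionSession, fps: Int) {
    let status = VTSessionSetProperty(session,
                                      key: kVTCompressionPropertyKey_ExpectedFrameRate,
                                      value: NSNumber(value: fps))
    assert(status == 0, "Failed to set ExpectedFrameRate hint")
}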
The same applies to Android. There we can select a frame rate range from a pre-defined list of ranges supported by the encoder. Some ranges contain just one value (30..30) – that’s a “fixed frame rate”; some contain actual ranges (1..25) – that’s a “variable frame rate”. But that is also only a recommendation.
Android 60 FPS support notice: most Android devices with 60 FPS cameras do not expose this capability to third-party apps, so only the native camera app can use it. Even if your device has such a camera, most probably Larix won’t be able to use 60 FPS.
Known issues: on the Samsung S21 Ultra with a Qualcomm chip (usually shipped for the North American market), most streaming apps, including Larix, will not stay at 30 FPS or any other frame rate: after a few seconds of encoding, the rate drops to 15 FPS regardless of any settings. This appears to be a hardware encoder issue which we unfortunately cannot control.
Q3: How does Larix Broadcaster handle bitrate setup?
On Android, the bitrate parameter is set by simply typing the value.
On iOS, the bitrate parameter has a predefined set of values, and by default it is selected based on the resolution.
In general, once you define a bitrate (or use the default one), the device encoder will use it as the target bitrate for encoding, and Larix will publish the stream with that bitrate. If network conditions get worse, at some point you will see frame loss for RTMP and RTSP; SRT will try to compensate for it with its error recovery within the “latency” parameter period.
If you know that your network may be unreliable, you can enable the Adaptive bitrate feature.
The “Bitrate matches resolution” option is also available in case you don’t know the exact value. The following rules apply (see the sketch after the list):
- Bitrate (in Kbps) is selected based on the resolution height according to this table:
  2160: 4500, 1080: 3000, 720: 2000, 540: 1500, 480: 1000, 360: 700, 288: 500, 144: 300
- If you use HEVC, the bitrate is multiplied by 0.5.
- If you use 50 FPS or above, the bitrate is multiplied by 1.6.
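For illustration, here is a minimal Swift sketch of these rules (not Larix’s actual implementation; the function name and the fallback value are assumptions):

// Pick the default bitrate in Kbps from the video height, then apply the
// HEVC and high-frame-rate multipliers listed above.
func defaultBitrateKbps(videoHeight: Int, usesHevc: Bool, fps: Double) -> Double {
    let table: [Int: Double] = [2160: 4500, 1080: 3000, 720: 2000, 540: 1500,
                                480: 1000, 360: 700, 288: 500, 144: 300]
    var bitrate = table[videoHeight] ?? 2000   // fallback for unlisted heights is an assumption
    if usesHevc { bitrate *= 0.5 }             // HEVC needs roughly half the bitrate
    if fps >= 50 { bitrate *= 1.6 }            // 50 FPS and above needs more bits
    return bitrate
}

// Example: 1080p, HEVC, 60 FPS -> 3000 * 0.5 * 1.6 = 2400 Kbps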
Q4: Can I make my application perform streaming from the background?
How can we stream when the app is closed, running in the background, or when the device is locked?
On Android it’s available as a non-default feature. To use it, go to the “Advanced options” menu and enable “Background streaming”. After the app is restarted, it will keep working in the background. Also notice the “Quit if inactive in background” option.
On iOS, background mode imposes a lot of limitations: the encoder and camera are not accessible, only audio is available. Larix Screencaster is built as an application extension and cannot be used as a foundation for a full-featured background streaming app.
Q5: My Android device has multiple rear and front cameras, how can I use them?
Larix shows all cameras that are available via the system API. Android 10+ provides extended capabilities for capturing from physical cameras, and Larix supports that. However, many manufacturers allow using the additional cameras only in their own apps. So if a device doesn’t expose this information, Larix won’t be able to work with those cameras.
Q6: What is the minimum latency for an outgoing stream?
Glass-to-glass latency between the device’s sensor and the playback device consists of multiple components. We described it in this article to cover all stages of delivery. The mobile device hardware, Larix Broadcaster and the delivery protocol are responsible for the first few stages, and Larix can control only part of them.
The device’s sensor and encoder add some latency, but it’s insignificant compared to later stages – just dozens of milliseconds.
Interleaving buffer. Once frames arrive from the encoder, the Larix streaming library first collects them in a buffer to perform interleaving compensation. This step is needed to align video and audio frames for better processing on the receiver side. The interleaving buffer usually collects about 16 frames of audio and video; for 30 FPS video this means about 250 ms of delay. Using the Larix SDK it’s possible to disable interleaving compensation, but we strongly recommend keeping it enabled to avoid issues on the receiver side.
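As a rough Swift back-of-the-envelope illustration of where the ~250 ms figure comes from (the assumption that roughly half of the buffered frames are video frames is ours):

let bufferedFrames = 16.0   // about 16 audio and video frames are buffered
let videoShare = 0.5        // assumed: roughly half of them are video frames
let fps = 30.0
let delaySeconds = bufferedFrames * videoShare / fps   // ~0.27 s, i.e. about 250 ms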
Then there is a buffer that prepares data for sending via the designated protocol, so the further stages depend on the protocol. Also, each platform has its specifics.
Notice that Larix is capable of streaming via multiple simultaneous connections, so each connection will have its own latency.
iOS-specific buffering.
For RTMP and RTSP the buffer is currently 200 items, which is 3000 ms at maximum. However, the latency may be lower if the stream starts close to the first frame of a GOP. The GOP size is 2000 ms, so there’s a high chance of getting lower latency. You may reduce the buffer in the SDK via a library method.
For SRT and RIST, data from the interleaving buffer is sent directly to the respective protocol libraries, so no additional delay is introduced there. However, SRT has a “latency” parameter and RIST has a “buffer” parameter for their own packet loss compensation algorithms. Read the respective protocols’ docs to learn more about recommended values.
Android-specific buffering. RTMP, RTSP, SRT and RIST take data from the same buffer, which is 200 items, or 3000 ms. You may reduce it in the Larix app via the Advanced options / Enable custom encoder buffer menu item, as well as via a library method.
If you use only SRT or RIST, you may reduce the buffer down to the 250 ms needed for interleaving compensation. As mentioned for iOS above, SRT has a “latency” parameter and RIST has a “buffer” parameter for their own packet loss compensation algorithms.
The approaches described above may change over time because our team keeps improving them.
Q7: Why does low bitrate setting not take effect?
Q: I specified a low bitrate in the video encoding settings, but my stream doesn’t honor it. Why is the bitrate much higher?
This is the way quality floor enforcement works on Android: the encoder always tries to produce a good-quality picture, or to preserve details in fast-motion scenes, by increasing the bitrate. There’s no way to control this in VBR encoding mode, which most mobile devices provide and use by default.
You can find more details in the official Android documentation.
If you need a low bitrate regardless of quality, please look for a mobile device with CBR encoding support and specify CBR in Larix encoder parameters along with the low bitrate setting.
Q8: How can I stream from Larix to OBS on my desktop PC over mobile network?
Usually, LAN traffic is separated from the WAN by a router. The OBS computer is in the LAN, but the mobile device connects via the WAN. To make the mobile device connect to a computer in your LAN, a translation rule must be created to redirect connections from the WAN to the LAN. This rule simply tells the router to redirect connections from a WAN IP:port to some LAN computer’s IP and port. The rule is set on the router and can usually be created in its “NAT” or “Port forwarding” section. When creating rules, specify the UDP protocol for SRT or RIST connections and TCP for RTMP or RTSP connections.
The setup details of this NAT rule heavily depend on the router model and network configuration. Routers are different, so check your router’s manual for details on creating translation rules.
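For example, a rule for an SRT connection could look like this (all addresses and ports below are hypothetical):

Protocol: UDP
External (WAN): router’s public IP (e.g. 203.0.113.10), port 9710
Internal (LAN): 192.168.1.20 (the OBS computer), port 9710

In Larix you would then point the connection URL at the router’s public address, e.g. srt://203.0.113.10:9710.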
Q9: I cannot stream within my local network from Larix for iOS.
I’m trying to stream over a local WiFi network without Internet access to a local RTMP server, but I get a “No Internet connection” error. All devices use manual IPs and there is no DHCP or DNS server.
This is a known issue of the latest iOS versions. You must specify some DNS server in the network settings (e.g. 1.1.1.1) to get a working local WiFi connection, even without an actual Internet connection.
Q10: USB OTG camera (UVC) does not work on my Android device, is there any chance to make it work?
We’ve put together a USB OTG support overview to answer this question.
Q11: What do Advanced connections settings mean?
The Settings > Connections > Advanced settings menu has the following options:
- Max. active connections is the maximum number of simultaneously streamed connections. This value limits the number of connections in the Connections menu. By default it’s “3”.
- Reconnect timeout is the time in seconds that Larix will wait before trying to connect again.
- Don’t check network presence. By default, a stream won’t start if no network is available, and no retries are performed if the network occasionally goes offline. If this option is enabled, Larix will attempt to connect regardless of network presence, both when starting and when resuming a stream.
- Reconnect timeout w/o network is the time in seconds that Larix will wait before trying to connect again when no network is present. If set to Never, Larix will never perform reconnection.
- Idle timeout: Larix will stop streaming if no data can be sent to the publishing point during this period; it’s set in seconds.
- Unsent threshold (iOS only) defines the time in seconds for storing unsent data. If more data than that accumulates in Larix’s buffer, Larix will stop streaming.
Q12: Where can I find saved files after video recording?
On iOS, the recordings can be saved to iCloud Drive and Photo Library.
On Android 8+ you can select any destination for recording, including an SD card. On earlier OS versions we use /DCIM/LarixBroadcaster/ on internal storage due to an Android limitation.
Q13: How can I combine the files produced by split recording?
You may install MP4Box and run a command like “MP4Box -add input_file1.mp4 -cat input_file2.mp4 output_file.mp4”.
Q14: Why doesn’t Larix Screencaster provide audio from my apps?
Android 9 and earlier do not allow applications to capture audio from other applications. This is a security constraint, so Larix is not able to do that.
On Android 10, capturing app audio is allowed from apps which support external recording. To use this option, choose Audio -> Sound settings -> Media sounds.
The iOS platform allows capturing the screen of the user’s device but puts limitations on audio: if you stream your screen, you can only use your microphone.
If the currently opened application supports ReplayKit, then you’ll be able to stream its sound.
Q15: How can I remove the white bar at the bottom of Larix Player for iOS?
If you’re an end user, you can follow this article to disable that white bar.
If you’re a subscriber of the Player SDK for iOS, add the following to PlayerView:
override var prefersHomeIndicatorAutoHidden: Bool {
    return true
}
Larix Tuner web control service
You can manage multiple Larix Broadcaster app instances from the Larix Tuner web service to simplify remote production and enhance your streaming workflow. You can enable Larix Premium features via the web service on any number of devices. Larix Tuner also allows creating backups of Larix Broadcaster settings and restoring them. Visit the Larix Tuner page for more details and sign up to simplify your REMI workflow.
Helpdesk
If you still have questions, contact us.