Table of Contents
- Installation
- Examples
- Usage
- iOS App Transport Security
- Audio Mixing
- Android Expansion File Usage
- Updating
- Contributing
Installation
Using npm:
npm install --save react-native-video
or using yarn:
yarn add react-native-video
Then follow the instructions for your platform to link react-native-video into your project:
iOS installation
iOS details
Standard Method
React Native 0.60 and above
Run npx pod-install. Linking is not required in React Native 0.60 and above.
React Native 0.59 and below
Run react-native link react-native-video to link the react-native-video library.
Enable Static Linking for dependencies in your ios project Podfile
Add use_frameworks! :linkage => :static just under platform :ios in your ios project Podfile.
See the example ios project for reference
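A minimal placement sketch (the iOS version is just an example; the rest of your Podfile stays unchanged):
platform :ios, '12.4'
use_frameworks! :linkage => :static
# ... your existing targets, use_react_native!, etc. remain as they are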
Using CocoaPods (required to enable caching)
Set up your Podfile as described in the react-native documentation.
Depending on your requirements you have to choose between the two possible subpodspecs:
Video only:
pod 'Folly', :podspec => '../node_modules/react-native/third-party-podspecs/Folly.podspec'
+ pod 'react-native-video', :path => '../node_modules/react-native-video/react-native-video.podspec'
end
Video with caching (more info):
pod 'Folly', :podspec => '../node_modules/react-native/third-party-podspecs/Folly.podspec'
+ pod 'react-native-video/VideoCaching', :path => '../node_modules/react-native-video/react-native-video.podspec'
end
tvOS installation
tvOS details
react-native link react-native-video doesn't work properly with the tvOS target, so we need to add the library manually.
First select your project in Xcode.
After that, select the tvOS target of your application and select « General » tab
Scroll to « Linked Frameworks and Libraries » and tap on the + button
Select RCTVideo-tvOS
Android installation
Android details
Linking is not required in React Native 0.60 and above.
If your project is using React Native < 0.60, run react-native link react-native-video to link the react-native-video library.
Or if you have trouble, make the following additions to the given files manually:
android/settings.gradle
Add player source in build configuration
include ':react-native-video'
project(':react-native-video').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-video/android')
android/app/build.gradle
From version >= 5.0.0, you have to apply these changes:
dependencies {
...
compile project(':react-native-video')
+ implementation "androidx.appcompat:appcompat:1.0.0"
- implementation "com.android.support:appcompat-v7:${rootProject.ext.supportLibVersion}"
}
android/gradle.properties
Migrating to AndroidX (needs version >= 5.0.0):
android.useAndroidX=true
android.enableJetifier=true
MainApplication.java
If using com.facebook.react.PackageList to auto import native dependencies, there are no updates required here. Please see the android example project for more details. /examples/basic/android/app/src/main/java/com/videoplayer/MainApplication.java
For manual linking
On top, where imports are:
import com.brentvatne.react.ReactVideoPackage;
Add the ReactVideoPackage class to your list of exported packages.
@Override
protected List<ReactPackage> getPackages() {
return Arrays.asList(
new MainReactPackage(),
new ReactVideoPackage()
);
}
Windows installation
Windows RNW C++/WinRT details
Autolinking
React Native Windows 0.63 and above
Autolinking should automatically add react-native-video to your app.
Manual Linking
React Native Windows 0.62
Make the following additions to the given files manually:
windows\myapp.sln
Add the ReactNativeVideoCPP project to your solution (e.g. windows\myapp.sln):
- Open your solution in Visual Studio 2019
- Right-click Solution icon in Solution Explorer > Add > Existing Project...
- Select
node_modules\react-native-video\windows\ReactNativeVideoCPP\ReactNativeVideoCPP.vcxproj
windows\myapp\myapp.vcxproj
Add a reference to ReactNativeVideoCPP to your main application project (e.g. windows\myapp\myapp.vcxproj):
- Open your solution in Visual Studio 2019
- Right-click main application project > Add > Reference...
- Check ReactNativeVideoCPP from Solution Projects
pch.h
Add #include "winrt/ReactNativeVideoCPP.h".
app.cpp
Add PackageProviders().Append(winrt::ReactNativeVideoCPP::ReactPackageProvider()); before InitializeComponent();.
React Native Windows 0.61 and below
Follow the manual linking instructions for React Native Windows 0.62 above, but substitute ReactNativeVideoCPP61 for ReactNativeVideoCPP.
Examples
Run yarn xbasic install in the root directory before running any of the examples.
iOS Example
yarn xbasic ios
Android Example
yarn xbasic android
Windows Example
yarn xbasic windows
Usage
// Load the module
import Video from 'react-native-video';
// Within your render function, assuming you have a file called
// "background.mp4" in your project. You can include multiple videos
// on a single screen if you like.
<Video source={{uri: "background"}} // Can be a URL or a local file.
ref={(ref) => {
this.player = ref
}} // Store reference
onBuffer={this.onBuffer} // Callback when remote video is buffering
onError={this.videoError} // Callback when video cannot be loaded
style={styles.backgroundVideo} />
// Later on in your styles..
var styles = StyleSheet.create({
backgroundVideo: {
position: 'absolute',
top: 0,
left: 0,
bottom: 0,
right: 0,
},
});
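The onBuffer and videoError callbacks referenced above are yours to define. A minimal sketch, assuming a class component that tracks buffering in state, might look like this:
// Handle buffering updates from the player
onBuffer = ({ isBuffering }) => {
  this.setState({ buffering: isBuffering }); // e.g. toggle a loading spinner
};
// Handle load/playback errors
videoError = (error) => {
  console.log('Video error', error); // log or surface the error payload
};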
Configurable props
Event props
Name | Platforms Support |
---|---|
onAudioBecomingNoisy | Android, iOS |
onAudioTracks | Android |
onBandwidthUpdate | Android |
onBuffer | Android, iOS |
onEnd | All |
onExternalPlaybackChange | iOS |
onFullscreenPlayerWillPresent | Android, iOS |
onFullscreenPlayerDidPresent | Android, iOS |
onFullscreenPlayerWillDismiss | Android, iOS |
onFullscreenPlayerDidDismiss | Android, iOS |
onLoad | All |
onLoadStart | All |
onReadyForDisplay | Android, iOS, Web |
onPictureInPictureStatusChanged | iOS |
onPlaybackRateChange | All |
onProgress | All |
onSeek | Android, iOS, Windows UWP |
onRestoreUserInterfaceForPictureInPictureStop | iOS |
onTimedMetadata | Android, iOS |
onTextTracks | Android |
onVideoTracks | Android |
Methods
Name | Platforms Support |
---|---|
dismissFullscreenPlayer | Android, iOS |
presentFullscreenPlayer | Android, iOS |
save | iOS |
restoreUserInterfaceForPictureInPictureStop | iOS |
seek | All |
Static methods
Name | Platforms Support |
---|---|
getWidevineLevel | Android |
isCodecSupported | Android |
isHEVCSupported | Android |
Configurable props
allowsExternalPlayback
Indicates whether the player allows switching to external playback mode such as AirPlay or HDMI.
- true (default) - allow switching to external playback mode
- false - Don't allow switching to external playback mode
Platforms: iOS
audioOnly
Indicates whether the player should play only the audio track and show the poster instead of displaying the video track.
- false (default) - Display the video as normal
- true - Show the poster and play the audio
For this to work, the poster prop must be set.
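Example (a sketch; the URLs are placeholders):
<Video source={{ uri: 'https://example.com/episode.mp3' }} // audio-only media
  audioOnly={true}
  poster="https://example.com/artwork.png" // required for audioOnly
  style={styles.backgroundVideo} />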
Platforms: all
automaticallyWaitsToMinimizeStalling
A Boolean value that indicates whether the player should automatically delay playback in order to minimize stalling. This applies to clients linked against iOS 10.0 and later.
- false - Immediately starts playback
- true (default) - Delays playback in order to minimize stalling
Platforms: iOS
backBufferDurationMs
The number of milliseconds of buffer to keep before the current position. This allows rewinding without rebuffering within that duration.
Platforms: Android
bufferConfig
Adjust the buffer settings. This prop takes an object with one or more of the properties listed below.
Property | Type | Description |
---|---|---|
minBufferMs | number | The default minimum duration of media that the player will attempt to ensure is buffered at all times, in milliseconds. |
maxBufferMs | number | The default maximum duration of media that the player will attempt to buffer, in milliseconds. |
bufferForPlaybackMs | number | The default duration of media that must be buffered for playback to start or resume following a user action such as a seek, in milliseconds. |
bufferForPlaybackAfterRebufferMs | number | The default duration of media that must be buffered for playback to resume after a rebuffer, in milliseconds. A rebuffer is defined to be caused by buffer depletion rather than a user action. |
maxHeapAllocationPercent | number | The percentage of available heap that the video can use to buffer, between 0 and 1 |
minBackBufferMemoryReservePercent | number | The percentage of available app memory at which during startup the back buffer will be disabled, between 0 and 1 |
minBufferMemoryReservePercent | number | The percentage of available app memory to keep in reserve that prevents buffer from using it, between 0 and 1 |
This prop should only be set when you are setting the source; changing it after the media is loaded will cause it to be reloaded.
Example with default values:
bufferConfig={{
minBufferMs: 15000,
maxBufferMs: 50000,
bufferForPlaybackMs: 2500,
bufferForPlaybackAfterRebufferMs: 5000
}}
Platforms: Android
currentPlaybackTime
When playing an HLS live stream with an EXT-X-PROGRAM-DATE-TIME tag configured, this property will contain the epoch value in msec.
Platforms: Android, iOS
contentStartTime
The start time in ms for SSAI content. This determines at what time to load the video info, such as available resolutions. Use this only when you have an SSAI stream where the ad resolution is not the same as the content resolution.
controls
Determines whether to show player controls.
- false (default) - Don't show player controls
- true - Show player controls
Note on iOS, controls are always shown when in fullscreen mode.
Note on Android, native controls are available by default. If needed, you can also add your own controls or use a package like react-native-video-controls.
Platforms: Android, iOS
disableFocus
Determines whether video audio should override background music/audio in Android devices.
- false (default) - Override background audio/music
- true - Let background audio/music from other apps play
Note: Allows multiple videos to play if set to true. If false, when one video is playing and another is started, the first video will be paused.
Platforms: Android
disableDisconnectError
Determines whether the player should throw an error when the connection is lost.
- false (default) - Player will throw an error when the connection is lost
- true - Player will keep trying to buffer when the network connection is lost
Platforms: Android
DRM
To setup DRM please follow this guide
Platforms: Android, iOS
filter
Add video filter
- FilterType.NONE (default) - No Filter
- FilterType.INVERT - CIColorInvert
- FilterType.MONOCHROME - CIColorMonochrome
- FilterType.POSTERIZE - CIColorPosterize
- FilterType.FALSE - CIFalseColor
- FilterType.MAXIMUMCOMPONENT - CIMaximumComponent
- FilterType.MINIMUMCOMPONENT - CIMinimumComponent
- FilterType.CHROME - CIPhotoEffectChrome
- FilterType.FADE - CIPhotoEffectFade
- FilterType.INSTANT - CIPhotoEffectInstant
- FilterType.MONO - CIPhotoEffectMono
- FilterType.NOIR - CIPhotoEffectNoir
- FilterType.PROCESS - CIPhotoEffectProcess
- FilterType.TONAL - CIPhotoEffectTonal
- FilterType.TRANSFER - CIPhotoEffectTransfer
- FilterType.SEPIA - CISepiaTone
For more details on these filters refer to the iOS docs.
Notes:
- Using a filter can impact CPU usage. A workaround is to save the video with the filter and then load the saved video.
- Video filter is currently not supported on HLS playlists.
- filterEnabled must be set to true
Platforms: iOS
filterEnabled
Enable video filter.
- false (default) - Don't enable filter
- true - Enable filter
Platforms: iOS
fullscreen
Controls whether the player enters fullscreen on play.
- false (default) - Don't display the video in fullscreen
- true - Display the video in fullscreen
Platforms: iOS
fullscreenAutorotate
If a preferred fullscreenOrientation is set, causes the video to rotate to that orientation but permits rotation of the screen to the orientation held by the user. Defaults to true.
Platforms: iOS
fullscreenOrientation
- all (default) -
- landscape
- portrait
Platforms: iOS
headers
Pass headers to the HTTP client. Can be used for authorization. Headers must be a part of the source object.
Example:
source={{
uri: "https://www.example.com/video.mp4",
headers: {
Authorization: 'bearer some-token-value',
'X-Custom-Header': 'some value'
}
}}
Platforms: Android
hideShutterView
Controls whether the ExoPlayer shutter view (black screen while loading) is enabled.
- false (default) - Show shutter view
- true - Hide shutter view
Platforms: Android
ignoreSilentSwitch
Controls the iOS silent switch behavior
- "inherit" (default) - Use the default AVPlayer behavior
- "ignore" - Play audio even if the silent switch is set
- "obey" - Don't play audio if the silent switch is set
Platforms: iOS
maxBitRate
Sets the desired limit, in bits per second, of network bandwidth consumption when multiple video streams are available for a playlist.
Default: 0. Don't limit the maxBitRate.
Example:
maxBitRate={2000000} // 2 megabits
Platforms: Android, iOS
minLoadRetryCount
Sets the minimum number of times to retry loading data before failing and reporting an error to the application. Useful to recover from transient internet failures.
Default: 3. Retry 3 times.
Example:
minLoadRetryCount={5} // retry 5 times
Platforms: Android
mixWithOthers
Controls how audio from this component mixes with audio from other apps.
- "inherit" (default) - Use the default AVPlayer behavior
- "mix" - Audio from this video mixes with audio from other apps.
- "duck" - Reduces the volume of other apps while audio from this video plays.
Platforms: iOS
muted
Controls whether the audio is muted
- false (default) - Don't mute audio
- true - Mute audio
Platforms: all
paused
Controls whether the media is paused
- false (default) - Don't pause the media
- true - Pause the media
Platforms: all
pictureInPicture
Determine whether the media should be played as picture in picture.
- false (default) - Don't play as picture in picture
- true - Play the media as picture in picture
Platforms: iOS
playInBackground
Determine whether the media should continue playing while the app is in the background. This allows customers to continue listening to the audio.
- false (default) - Don't continue playing the media
- true - Continue playing the media
To use this feature on iOS, you must:
- Enable Background Audio in your Xcode project
- Set the ignoreSilentSwitch prop to "ignore"
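Example (a sketch; enabling Background Audio in Xcode still has to be done separately, and the URL is a placeholder):
<Video source={{ uri: 'https://example.com/stream.m3u8' }}
  playInBackground={true}
  ignoreSilentSwitch="ignore"
  style={styles.backgroundVideo} />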
Platforms: Android, iOS
playWhenInactive
Determine whether the media should continue playing when notifications or the Control Center are in front of the video.
- false (default) - Don't continue playing the media
- true - Continue playing the media
Platforms: iOS
poster
An image to display while the video is loading
Value: string with a URL for the poster, e.g. "https://baconmockup.com/300/200/"
Platforms: all
posterResizeMode
Determines how to resize the poster image when the frame doesn't match the raw video dimensions.
- "contain" (default) - Scale the image uniformly (maintain the image's aspect ratio) so that both dimensions (width and height) of the image will be equal to or less than the corresponding dimension of the view (minus padding).
- "center" - Center the image in the view along both dimensions. If the image is larger than the view, scale it down uniformly so that it is contained in the view.
- "cover" - Scale the image uniformly (maintain the image's aspect ratio) so that both dimensions (width and height) of the image will be equal to or larger than the corresponding dimension of the view (minus padding).
- "none" - Don't apply resize
- "repeat" - Repeat the image to cover the frame of the view. The image will keep its size and aspect ratio. (iOS only)
- "stretch" - Scale width and height independently, This may change the aspect ratio of the src.
Platforms: all
preferredForwardBufferDuration
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption. Sets the preferredForwardBufferDuration instance property on AVPlayerItem.
Default: 0
Platforms: iOS
preventsDisplaySleepDuringVideoPlayback
Controls whether or not the display should be allowed to sleep while playing the video. Default is not to allow display to sleep.
Default: true
Platforms: iOS, Android
progressUpdateInterval
Delay in milliseconds between onProgress events.
Default: 250.0
Platforms: all
rate
Speed at which the media should play.
- 0.0 - Pauses the video
- 1.0 - Play at normal speed
- Other values - Slow down or speed up playback
Platforms: all
repeat
Determine whether to repeat the video when the end is reached
- false (default) - Don't repeat the video
- true - Repeat the video
Platforms: all
onAudioTracks
Callback function that is called when audio tracks change
Payload:
Property | Type | Description |
---|---|---|
index | number | Internal track ID |
title | string | Descriptive name for the track |
language | string | 2 letter ISO 639-1 code representing the language |
bitrate | number | bitrate of track |
type | string | Mime type of track |
selected | boolean | true if track is playing |
Example:
{
audioTracks: [
{ language: 'es', title: 'Spanish', type: 'audio/mpeg', index: 0, selected: true },
{ language: 'en', title: 'English', type: 'audio/mpeg', index: 1 }
],
}
Platforms: Android
reportBandwidth
Determine whether to generate onBandwidthUpdate events. This is needed due to the high frequency of these events on ExoPlayer.
- false (default) - Don't generate onBandwidthUpdate events
- true - Generate onBandwidthUpdate events
Platforms: Android
resizeMode
Determines how to resize the video when the frame doesn't match the raw video dimensions.
- "none" (default) - Don't apply resize
- "contain" - Scale the video uniformly (maintain the video's aspect ratio) so that both dimensions (width and height) of the video will be equal to or less than the corresponding dimension of the view (minus padding).
- "cover" - Scale the video uniformly (maintain the video's aspect ratio) so that both dimensions (width and height) of the image will be equal to or larger than the corresponding dimension of the view (minus padding).
- "stretch" - Scale width and height independently, This may change the aspect ratio of the src.
Platforms: Android, iOS, Windows UWP
selectedAudioTrack
Configure which audio track, if any, is played.
selectedAudioTrack={{
type: Type,
value: Value
}}
Example:
selectedAudioTrack={{
type: "title",
value: "Dubbing"
}}
Type | Value | Description |
---|---|---|
"system" (default) | N/A | Play the audio track that matches the system language. If none match, play the first track. |
"disabled" | N/A | Turn off audio |
"title" | string | Play the audio track with the title specified as the Value, e.g. "French" |
"language" | string | Play the audio track with the language specified as the Value, e.g. "fr" |
"index" | number | Play the audio track with the index specified as the value, e.g. 0 |
If a track matching the specified Type (and Value if appropriate) is unavailable, the first audio track will be played. If multiple tracks match the criteria, the first match will be used.
Platforms: Android, iOS
selectedTextTrack
Configure which text track (caption or subtitle), if any, is shown.
selectedTextTrack={{
type: Type,
value: Value
}}
Example:
selectedTextTrack={{
type: "title",
value: "English Subtitles"
}}
Type | Value | Description |
---|---|---|
"system" (default) | N/A | Display captions only if the system preference for captions is enabled |
"disabled" | N/A | Don't display a text track |
"title" | string | Display the text track with the title specified as the Value, e.g. "French 1" |
"language" | string | Display the text track with the language specified as the Value, e.g. "fr" |
"index" | number | Display the text track with the index specified as the value, e.g. 0 |
Both iOS & Android (only 4.4 and higher) offer Settings to enable Captions for hearing impaired people. If "system" is selected and the Captions Setting is enabled, iOS/Android will look for a caption that matches that customer's language and display it.
If a track matching the specified Type (and Value if appropriate) is unavailable, no text track will be displayed. If multiple tracks match the criteria, the first match will be used.
Platforms: Android, iOS
selectedVideoTrack
Configure which video track should be played. By default, the player uses Adaptive Bitrate Streaming to automatically select the stream it thinks will perform best based on available bandwidth.
selectedVideoTrack={{
type: Type,
value: Value
}}
Example:
selectedVideoTrack={{
type: "resolution",
value: 480
}}
Type | Value | Description |
---|---|---|
"auto" (default) | N/A | Let the player determine which track to play using ABR |
"disabled" | N/A | Turn off video |
"resolution" | number | Play the video track with the height specified, e.g. 480 for the 480p stream |
"index" | number | Play the video track with the index specified as the value, e.g. 0 |
If a track matching the specified Type (and Value if appropriate) is unavailable, ABR will be used.
Platforms: Android
source
Sets the media source. You can pass an asset loaded via require or an object with a uri.
Setting the source will trigger the player to attempt to load the provided media with all other given props. Please be sure that all props are provided before/at the same time as setting the source.
Rendering the player component with a null source will init the player, and start playing once a source value is provided.
Providing a null source value after loading a previous source will stop playback, and clear out the previous source content.
The docs for this prop are incomplete and will be updated as each option is investigated and tested.
Asset loaded via require
Example:
const sintel = require('./sintel.mp4');
source={sintel}
URI string
A number of URI schemes are supported by passing an object with a uri attribute.
Web address (http://, https://)
Example:
source={{uri: 'https://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_10mb.mp4' }}
Platforms: all
File path (file://)
Example:
source={{ uri: 'file:///sdcard/Movies/sintel.mp4' }}
Note: Your app will need to request permission to read external storage if you're accessing a file outside your app.
Platforms: Android, possibly others
iPod Library (ipod-library://)
Path to a sound file in your iTunes library. Typically shared from iTunes to your app.
Example:
source={{ uri: 'ipod-library:///path/to/music.mp3' }}
Note: Using this feature requires adding an entry for NSAppleMusicUsageDescription to your Info.plist file as described here.
Platforms: iOS
Explicit mimetype for the stream
Provide a member type with value ('mpd' / 'm3u8' / 'ism') inside the source object.
This is sometimes needed when the URL extension does not match the mimetype you are expecting, as in the following example (the extension is .ism (Smooth Streaming), but the file served is in mpd (MPEG-DASH) format).
Example:
source={{ uri: 'http://host-serving-a-type-different-than-the-extension.ism/manifest(format=mpd-time-csf)',
type: 'mpd' }}
Other protocols
The following other types are supported on some platforms, but aren't fully documented yet:
content://, ms-appx://, ms-appdata://, assets-library://
subtitleStyle
Property | Description | Platforms |
---|---|---|
fontSize | Adjust the font size of the subtitles. Default: font size of the device | Android |
paddingTop | Adjust the top padding of the subtitles. Default: 0 | Android |
paddingBottom | Adjust the bottom padding of the subtitles. Default: 0 | Android |
paddingLeft | Adjust the left padding of the subtitles. Default: 0 | Android |
paddingRight | Adjust the right padding of the subtitles. Default: 0 | Android |
Example:
subtitleStyle={{ paddingBottom: 50, fontSize: 20 }}
textTracks
Load one or more "sidecar" text tracks. This takes an array of objects representing each track. Each object should have the format:
Property | Description |
---|---|
title | Descriptive name for the track |
language | 2 letter ISO 639-1 code representing the language |
type | Mime type of the track * TextTrackType.SRT - SubRip (.srt) * TextTrackType.TTML - TTML (.ttml) * TextTrackType.VTT - WebVTT (.vtt) iOS only supports VTT, Android supports all 3 |
uri | URL for the text track. Currently, only tracks hosted on a webserver are supported |
On iOS, sidecar text tracks are only supported for individual files, not HLS playlists. For HLS, you should include the text tracks as part of the playlist.
Note: Due to iOS limitations, sidecar text tracks are not compatible with Airplay. If textTracks are specified, AirPlay support will be automatically disabled.
Example:
import Video, { TextTrackType } from 'react-native-video';
textTracks={[
{
title: "English CC",
language: "en",
type: TextTrackType.VTT, // "text/vtt"
uri: "https://bitdash-a.akamaihd.net/content/sintel/subtitles/subtitles_en.vtt"
},
{
title: "Spanish Subtitles",
language: "es",
type: TextTrackType.SRT, // "application/x-subrip"
uri: "https://durian.blender.org/wp-content/content/subtitles/sintel_es.srt"
}
]}
Platforms: Android, iOS
trackId
Configure an identifier for the video stream to link the playback context to the events emitted.
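Example (the identifier value is a placeholder):
trackId="some-unique-stream-id"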
Platforms: Android
useTextureView
Controls whether to output to a TextureView or SurfaceView.
SurfaceView is more efficient and provides better performance but has two limitations:
- It can't be animated, transformed or scaled
- You can't overlay multiple SurfaceViews
useTextureView can only be set at the same time you're setting the source.
- true (default) - Use a TextureView
- false - Use a SurfaceView
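Example:
useTextureView={false} // opt back into SurfaceView for better performance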
Platforms: Android
useSecureView
Force the output to a SurfaceView and enable the secure surface.
This will override the useTextureView flag.
SurfaceView is the only view type that can be labeled as secure.
- true - Use security
- false (default) - Do not use security
Platforms: Android
volume
Adjust the volume.
- 1.0 (default) - Play at full volume
- 0.0 - Mute the audio
- Other values - Reduce volume
Platforms: all
localSourceEncryptionKeyScheme
Set the URL scheme for the stream encryption key for local assets.
Type: String
Example:
localSourceEncryptionKeyScheme="my-offline-key"
Platforms: iOS
Event props
onAudioBecomingNoisy
Callback function that is called when the audio is about to become 'noisy' due to a change in audio outputs. Typically this is called when audio output is being switched from an external source like headphones back to the internal speaker. It's a good idea to pause the media when this happens so the speaker doesn't start blasting sound.
Payload: none
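Example (a sketch; pausing via component state is one common way to react):
onAudioBecomingNoisy = () => {
  this.setState({ paused: true }); // pause so the speaker doesn't start blasting sound
};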
Platforms: Android, iOS
onBandwidthUpdate
Callback function that is called when the available bandwidth changes.
Payload:
Property | Type | Description |
---|---|---|
bitrate | number | The estimated bitrate in bits/sec |
Example:
{
bitrate: 1000000
}
Note: On Android, you must set the reportBandwidth prop to enable this event. This is due to the high volume of events generated.
Platforms: Android
onBuffer
Callback function that is called when the player buffers.
Payload:
Property | Type | Description |
---|---|---|
isBuffering | boolean | Boolean indicating whether buffering is active |
Example:
{
isBuffering: true
}
Platforms: Android, iOS
onEnd
Callback function that is called when the player reaches the end of the media.
Payload: none
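Example (a sketch; what you do at the end of playback is app-specific):
onEnd = () => {
  this.setState({ paused: true }); // stop at the end
  this.player.seek(0); // rewind so the user can replay from the start
};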
Platforms: all
onExternalPlaybackChange
Callback function that is called when external playback mode for current playing video has changed. Mostly useful when connecting/disconnecting to Apple TV – it's called on connection/disconnection.
Payload:
Property | Type | Description |
---|---|---|
isExternalPlaybackActive | boolean | Boolean indicating whether external playback mode is active |
Example:
{
isExternalPlaybackActive: true
}
Platforms: iOS
onFullscreenPlayerWillPresent
Callback function that is called when the player is about to enter fullscreen mode.
Payload: none
Platforms: Android, iOS
onFullscreenPlayerDidPresent
Callback function that is called when the player has entered fullscreen mode.
Payload: none
Platforms: Android, iOS
onFullscreenPlayerWillDismiss
Callback function that is called when the player is about to exit fullscreen mode.
Payload: none
Platforms: Android, iOS
onFullscreenPlayerDidDismiss
Callback function that is called when the player has exited fullscreen mode.
Payload: none
Platforms: Android, iOS
onLoad
Callback function that is called when the media is loaded and ready to play.
Payload:
Property | Type | Description |
---|---|---|
currentPosition | number | Time in seconds where the media will start |
duration | number | Length of the media in seconds |
naturalSize | object | Properties: * width - Width in pixels that the video was encoded at * height - Height in pixels that the video was encoded at * orientation - "portrait" or "landscape" |
audioTracks | array | An array of audio track info objects with the following properties: * index - Index number * title - Description of the track * language - 2 letter ISO 639-1 or 3 letter ISO639-2 language code * type - Mime type of track |
textTracks | array | An array of text track info objects with the following properties: * index - Index number * title - Description of the track * language - 2 letter ISO 639-1 or 3 letter ISO 639-2 language code * type - Mime type of track |
videoTracks | array | An array of video track info objects with the following properties: * trackId - ID for the track * bitrate - Bit rate in bits per second * codecs - Comma separated list of codecs * height - Height of the video * width - Width of the video |
Example:
{
canPlaySlowForward: true,
canPlayReverse: false,
canPlaySlowReverse: false,
canPlayFastForward: false,
canStepForward: false,
canStepBackward: false,
currentTime: 0,
duration: 5910.208984375,
naturalSize: {
height: 1080,
orientation: 'landscape',
width: 1920
},
audioTracks: [
{ language: 'es', title: 'Spanish', type: 'audio/mpeg', index: 0 },
{ language: 'en', title: 'English', type: 'audio/mpeg', index: 1 }
],
textTracks: [
{ title: '#1 French', language: 'fr', index: 0, type: 'text/vtt' },
{ title: '#2 English CC', language: 'en', index: 1, type: 'text/vtt' },
{ title: '#3 English Director Commentary', language: 'en', index: 2, type: 'text/vtt' }
],
videoTracks: [
{ bitrate: 3987904, codecs: "avc1.640028", height: 720, trackId: "f1-v1-x3", width: 1280 },
{ bitrate: 7981888, codecs: "avc1.640028", height: 1080, trackId: "f2-v1-x3", width: 1920 },
{ bitrate: 1994979, codecs: "avc1.4d401f", height: 480, trackId: "f3-v1-x3", width: 848 }
]
}
Platforms: all
onLoadStart
Callback function that is called when the media starts loading.
Payload:
Property | Type |
---|---|
isNetwork | boolean |
type | string |
uri | string |
Example:
{
isNetwork: true,
type: '',
uri: 'https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8'
}
Platforms: all
onPlaybackStateChanged
Callback function that is called when the playback state changes.
Payload:
Property | Type |
---|---|
isPlaying | boolean |
Example:
{
isPlaying: true,
}
Platforms: Android
onReadyForDisplay
Callback function that is called when the first video frame is ready for display. This is when the poster is removed.
Payload: none
- iOS: readyForDisplay
- Android: STATE_READY
Platforms: Android, iOS, Web
onPictureInPictureStatusChanged
Callback function that is called when picture in picture becomes active or inactive.
Property | Type | Description |
---|---|---|
isActive | boolean | Boolean indicating whether picture in picture is active |
Example:
{
isActive: true
}
Platforms: iOS
onPlaybackRateChange
Callback function that is called when the rate of playback changes, such as when playback pauses or starts/resumes.
Property | Type | Description |
---|---|---|
playbackRate | number | 0 when playback is paused, 1 when playing at normal speed. Other values when playback is slowed down or sped up |
Example:
{
playbackRate: 0, // indicates paused
}
Platforms: all
onProgress
Callback function that is called every progressUpdateInterval milliseconds with info about which position the media is currently playing.
Property | Type | Description |
---|---|---|
currentTime | number | Current position in seconds |
playableDuration | number | Position to where the media can be played to using just the buffer in seconds |
seekableDuration | number | Position to where the media can be seeked to in seconds. Typically, the total length of the media |
Example:
{
currentTime: 5.2,
playableDuration: 34.6,
seekableDuration: 888
}
Platforms: all
onSeek
Callback function that is called when a seek completes.
Payload:
Property | Type | Description |
---|---|---|
currentTime | number | The current time after the seek |
seekTime | number | The requested time |
Example:
{
currentTime: 100.5,
seekTime: 100
}
Both the currentTime & seekTime are reported because the video player may not seek to the exact requested position in order to improve seek performance.
Platforms: Android, iOS, Windows UWP
onRestoreUserInterfaceForPictureInPictureStop
Callback function that corresponds to Apple's restoreUserInterfaceForPictureInPictureStopWithCompletionHandler. Call restoreUserInterfaceForPictureInPictureStopCompleted inside of this function when done restoring the user interface.
Payload: none
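A minimal handler sketch (assuming this.player holds the Video ref, as described under Methods below):
onRestoreUserInterfaceForPictureInPictureStop = () => {
  // Restore your UI here, then signal that restoration finished
  this.player.restoreUserInterfaceForPictureInPictureStopCompleted(true);
};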
Platforms: iOS
onTimedMetadata
Callback function that is called when timed metadata becomes available
Payload:
Property | Type | Description |
---|---|---|
metadata | array | Array of metadata objects |
Example:
{
metadata: [
{ value: 'Streaming Encoder', identifier: 'TRSN' },
{ value: 'Internet Stream', identifier: 'TRSO' },
{ value: 'Any Time You Like', identifier: 'TIT2' }
]
}
Platforms: Android, iOS
onTextTracks
Callback function that is called when text tracks change
Payload:
Property | Type | Description |
---|---|---|
index | number | Internal track ID |
title | string | Descriptive name for the track |
language | string | 2 letter ISO 639-1 code representing the language |
type | string | Mime type of the track * TextTrackType.SRT - SubRip (.srt) * TextTrackType.TTML - TTML (.ttml) * TextTrackType.VTT - WebVTT (.vtt) iOS only supports VTT, Android supports all 3 |
selected | boolean | true if track is playing |
Example:
{
textTracks: [
{
index: 0,
title: 'Any Time You Like',
type: 'srt',
selected: true
}
]
}
Platforms: Android
onVideoTracks
Callback function that is called when video tracks change
Payload:
Property | Type | Description |
---|---|---|
trackId | number | Internal track ID |
codecs | string | MimeType of codec used for this track |
width | number | Track width |
height | number | Track height |
bitrate | number | Bitrate in bps |
selected | boolean | true if track is selected for playing |
Example:
{
videoTracks: [
{
trackId: 0,
codecs: 'video/mp4',
width: 1920,
height: 1080,
bitrate: 10000,
selected: true
}
]
}
Platforms: Android
Methods
Methods operate on a ref to the Video element. You can create a ref using code like:
return (
<Video source={...}
ref={ref => (this.player = ref)} />
);
dismissFullscreenPlayer
dismissFullscreenPlayer()
Take the player out of fullscreen mode.
Example:
this.player.dismissFullscreenPlayer();
Platforms: Android, iOS
presentFullscreenPlayer
presentFullscreenPlayer()
Put the player in fullscreen mode.
On iOS, this displays the video in a fullscreen view controller with controls.
On Android, this puts the navigation controls in fullscreen mode. It is not a complete fullscreen implementation, so you will still need to apply a style that makes the width and height match your screen dimensions to get a fullscreen video.
Example:
this.player.presentFullscreenPlayer();
Platforms: Android, iOS
save
save(): Promise
Save video to your Photos with current filter prop. Returns promise.
Example:
let response = await this.player.save();
let path = response.uri;
Notes:
- Currently only supports highest quality export
- Currently only supports MP4 export
- Currently only supports exporting to user's cache directory with a generated UUID filename.
- User will need to remove the saved video through their Photos app
- Works with cached videos as well. (Checkout video-caching example)
- If the video has not begun buffering (e.g. there is no internet connection) then the save function will throw an error.
- If the video is buffering then the save function promise will return after the video has finished buffering and processing.
Future:
- Will support multiple qualities through options
- Will support more formats in the future through options
- Will support custom directory and file name through options
Platforms: iOS
restoreUserInterfaceForPictureInPictureStopCompleted
restoreUserInterfaceForPictureInPictureStopCompleted(restored)
This function corresponds to the completion handler in Apple's restoreUserInterfaceForPictureInPictureStop. IMPORTANT: This function must be called after onRestoreUserInterfaceForPictureInPictureStop is called.
Example:
this.player.restoreUserInterfaceForPictureInPictureStopCompleted(true);
Platforms: iOS
seek()
seek(seconds)
Seek to the specified position represented by seconds. seconds is a float value.
seek() can only be called after the onLoad event has fired. Once completed, the onSeek event will be called.
Example:
this.player.seek(200); // Seek to 3 minutes, 20 seconds
Platforms: all
Exact seek
By default iOS seeks within 100 milliseconds of the target position. If you need more accuracy, you can use the seek with tolerance method:
seek(seconds, tolerance)
tolerance is the max distance in milliseconds from the seconds position that's allowed. Using a more exact tolerance can cause seeks to take longer. If you want to seek exactly, set tolerance to 0.
Example:
this.player.seek(120, 50); // Seek to 2 minutes with +/- 50 milliseconds accuracy
Platforms: iOS
Static methods
Video Decoding capabilities
A module embedded in react-native-video allows querying the device's supported video decoding capabilities. To use it, import the module as follows:
import { VideoDecoderProperties } from 'react-native-video';
Platforms: Android
getWidevineLevel
Indicates the Widevine level supported by the device.
Possible results:
- 0 - unable to determine widevine support (typically not supported)
- 1, 2, 3 - Widevine level supported
Platforms: Android
Example:
VideoDecoderProperties.getWidevineLevel().then((widevineLevel) => {
  ...
});
isCodecSupported
Indicates whether the provided codec, at the given resolution, is supported by the device.
parameters:
- mimetype: mime type of codec to query
- width, height: resolution to query
Possible results:
- true - codec supported
- false - codec is not supported
Example:
VideoDecoderProperties.isCodecSupported('video/avc', 1920, 1080).then((isSupported) => {
  ...
});
Platforms: Android
isHEVCSupported
Helper that indicates whether HEVC at 1920x1080 is supported by the device. It uses isCodecSupported internally.
Example:
VideoDecoderProperties.isHEVCSupported().then((hevcSupported) => {
  ...
});
iOS App Transport Security
- By default, iOS will only load encrypted (https) URLs. If you want to load content from an unencrypted (http) source, you will need to modify your Info.plist file and add the following entry:
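A commonly used entry is shown below (a sketch; NSAllowsArbitraryLoads disables ATS for all domains, so per-domain exception keys are usually preferable):
<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <true/>
</dict>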
For more detailed info check this article
Audio Mixing
At some point in the future, react-native-video will include an Audio Manager for configuring how videos mix with other apps playing sounds on the device.
On iOS, if you would like to allow other apps to play music over your video component, make the following change:
AppDelegate.m
#import <AVFoundation/AVFoundation.h> // import
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
...
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil]; // allow
...
}
You can also use the ignoreSilentSwitch prop.
Android Expansion File Usage
Expansion files allow you to ship assets that exceed the 100MB apk size limit and don't need to be updated each time you push an app update.
This only supports mp4 files and they must not be compressed. Example command line for preventing compression:
zip -r -n .mp4 *.mp4 player.video.example.com
// Within your render function, assuming you have a file called
// "background.mp4" in your expansion file. Just add your main and (if applicable) patch version
<Video source={{uri: "background", mainVer: 1, patchVer: 0}} // Looks for .mp4 file (background.mp4) in the given expansion version.
resizeMode="cover" // Fill the whole screen at aspect ratio.
style={styles.backgroundVideo} />
Load files with the RN Asset System
The asset system introduced in RN 0.14 allows loading image resources shared across iOS and Android without touching native code. As of RN 0.31 the same is true of mp4 video assets for Android. As of RN 0.33 iOS is also supported. Requires react-native-video@0.9.0.
<Video
source={require('../assets/video/turntable.mp4')}
/>
Play in background on iOS
To enable audio to play in background on iOS, the audio session needs to be set to AVAudioSessionCategoryPlayback. See [Apple documentation][3] for additional details. (NOTE: there is now a ticket to expose this as a prop.)
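A sketch of the corresponding AppDelegate.m change (mirroring the Audio Mixing snippet above, but using the Playback category):
#import <AVFoundation/AVFoundation.h> // import
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
  ...
  [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil]; // allow background playback
  ...
}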
Examples
- See an [Example integration][1] in react-native-login. Note that this example uses an older version of this library, before we used export default -- if you use require you will need to do require('react-native-video').default as per instructions above.
- Try the included [VideoPlayer example][2] yourself:
git clone git@github.com:react-native-community/react-native-video.git
cd react-native-video/example
npm install
open ios/VideoPlayer.xcodeproj
Then Cmd+R to start the React Packager, build and run the project in the simulator.
- Lumpen Radio contains another example integration using local files and full screen background video.
Updating
Version 6.0.0
iOS
In your project Podfile add support for static dependency linking. This is required to support the new Promises subdependency in the iOS swift conversion.
Add use_frameworks! :linkage => :static just under platform :ios in your ios project Podfile.
See the example ios project for reference
Version 5.0.0
You will probably want to update your Gradle version:
gradle-wrapper.properties
- distributionUrl=https\://services.gradle.org/distributions/gradle-3.3-all.zip
+ distributionUrl=https\://services.gradle.org/distributions/gradle-5.1.1-all.zip
android/app/build.gradle
From version >= 5.0.0, you have to apply these changes:
dependencies {
...
compile project(':react-native-video')
+ implementation "androidx.appcompat:appcompat:1.0.0"
- implementation "com.android.support:appcompat-v7:${rootProject.ext.supportLibVersion}"
}
android/gradle.properties
Migrating to AndroidX (needs version >= 5.0.0):
android.useAndroidX=true
android.enableJetifier=true
Version 4.0.0
Gradle 3 and target SDK 26 requirement
In order to support ExoPlayer 2.9.0, you must use version 3 or higher of the Gradle plugin. This is included by default in React Native 0.57.
ExoPlayer 2.9.0 Java 1.8 requirement
ExoPlayer 2.9.0 uses some Java 1.8 features, so you may need to enable support for Java 1.8 in your app/build.gradle file. If you get an error compiling with ExoPlayer like:
Default interface methods are only supported starting with Android N (--min-api 24)
Add the following to your app/build.gradle file:
android {
... // Various other settings go here
compileOptions {
targetCompatibility JavaVersion.VERSION_1_8
}
}
ExoPlayer no longer detaches
When using a router like the react-navigation TabNavigator, switching between tab routes would previously cause ExoPlayer to detach causing the video player to pause. We now don't detach the view, allowing the video to continue playing in a background tab. This matches the behavior for iOS.
useTextureView now defaults to true
The SurfaceView, which ExoPlayer has been using by default, has a number of quirks that people are unaware of and that often cause issues. This includes not supporting animations or scaling. It also causes strange behavior if you overlay two videos on top of each other, because the SurfaceView will punch a hole through other views. Since TextureView doesn't have these issues and behaves in the way most developers expect, it makes sense to make it the default.
TextureView is not as fast as SurfaceView, so you may still want to enable SurfaceView support. To do this, you can set useTextureView={false}.
Version 3.0.0
All platforms now auto-play
Previously, on Android ExoPlayer if the paused prop was not set, the media would not automatically start playing. The only way it would work was if you set paused={false}
. This has been changed to automatically play if paused is not set so that the behavior is consistent across platforms.
All platforms now keep their paused state when returning from the background
Previously, on Android MediaPlayer, if you set up an AppState event handler to set the paused prop when the app went into the background (so that the video would be paused when you returned to the app), the prop change would be ignored.
Note, Windows does not have a concept of an app going into the background, so this doesn't apply there.
Use Android target SDK 27 by default
Version 3.0 updates the Android build tools and SDK to version 27. React Native is in the process of switching over to SDK 27 in preparation for Google's requirement that new Android apps use SDK 26 by August 2018.
You will either need to install the version 27 SDK and version 27.0.3 buildtools or modify your build.gradle file to configure react-native-video to use the same build settings as the rest of your app as described below.
Using app build settings
You will need to create a project.ext section in the top-level build.gradle file (not app/build.gradle). Fill in the values from the example below using the values found in your app/build.gradle file.
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
... // Various other settings go here
}
allprojects {
... // Various other settings go here
project.ext {
compileSdkVersion = 31
buildToolsVersion = "30.0.2"
minSdkVersion = 21
targetSdkVersion = 22
}
}
If you encounter the error Could not find com.android.support:support-annotations:27.0.0., reinstall your Android Support Repository.
Black Screen on Release build (Android)
If your video works in Debug mode but you see only a black screen in Release, check the link to your video. If it uses the 'http' protocol, you will need to add the following attribute to your AndroidManifest.xml file.
<application
...
android:usesCleartextTraffic="true"
>