Build a powerful video & audio calling app using VideoSDK, the solution that helps you build, scale, and innovate.
Why Use VideoSDK for Your React Native App?
Let's simplify the process of building a cross-platform video & audio calling app in React Native. Video calling has become an essential requirement; the feature is being used in ways never seen before, from telehealth apps to live commerce, and that's only the start. We can't assume what kind of app you will be building, but whatever it is, we provide the best Video SDK solution: feature-rich, easy to implement, robust, and, not to mention, free to use!
Key Features to Enhance Your React Native Video Calling App
React Native is a great choice for building cross-platform apps. However, providing exactly the same features on both platforms can be difficult. At VideoSDK, we have you covered: we provide great features you can use right away that work for both React Native Android and React Native iOS, including screen sharing on iOS devices, something rarely supported elsewhere. We offer 20+ features, so let your futuristic mind take over to create the best video & audio calling app in React Native. You can explore the full feature list and add them to your React Native application.
Now we are all set, so let's get started with the tutorial!
Getting Started with Your React Native Video Calling App
The steps below will give you all the information you need to quickly integrate the Video SDK into your app. Please follow along carefully, and if you face any difficulty, let us know immediately on Discord and we will help you right away.
Prerequisites for Installing VideoSDK
- Node.js v12+
- NPM v6+ (comes installed with newer Node versions)
- Android Studio or Xcode installed
- A token from the VideoSDK dashboard
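You can quickly confirm the Node.js and NPM versions from your terminal:
$ node -v
$ npm -v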
Create a new React Native app
Let's start by creating a new React Native app using the command:
$ npx react-native init AppName
Install Video SDK
Install the Video SDK by running the command below. Make sure you are in your project directory before you run it.
$ npm install "@videosdk.live/react-native-sdk"
Project Structure
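The exact layout is up to you, but one possible structure, consistent with the imports used later in this tutorial (the placement of api.js and App.js under src/ is an assumption based on those imports), looks roughly like this:
AppName
├── android          (native Android project, configured below)
├── ios              (native iOS project, configured below)
├── index.js         (entry point where the VideoSDK services are registered)
└── src
    ├── api.js       (token and createMeeting helper)
    └── App.js       (JoinScreen, MeetingView and controls)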
Project Configuration
You need to configure your project for Android and iOS to make sure the app runs smoothly.
Step-by-Step Guide to Configuring Your React Native App
Android Setup
Step 1: Add the required permission in the AndroidManifest.xml file.
<manifest
xmlns:android="http://schemas.android.com/apk/res/android"
package="com.cool.app"
>
<!-- Give all the required permissions to app -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Needed to communicate with already-paired Bluetooth devices. (Legacy up to Android 11) -->
<uses-permission
android:name="android.permission.BLUETOOTH"
android:maxSdkVersion="30" />
<uses-permission
android:name="android.permission.BLUETOOTH_ADMIN"
android:maxSdkVersion="30" />
<!-- Needed to communicate with already-paired Bluetooth devices. (Android 12 upwards)-->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
<uses-permission android:name="android.permission.WAKE_LOCK" />
<application>
<meta-data
android:name="live.videosdk.rnfgservice.notification_channel_name"
android:value="Meeting Notification"
/>
<meta-data
android:name="live.videosdk.rnfgservice.notification_channel_description"
android:value="Whenever meeting started notification will appear."
/>
<meta-data
android:name="live.videosdk.rnfgservice.notification_color"
android:resource="@color/red"
/>
<service android:name="live.videosdk.rnfgservice.ForegroundService" android:foregroundServiceType="mediaProjection"></service>
<service android:name="live.videosdk.rnfgservice.ForegroundServiceTask"></service>
</application>
</manifest>
Step 2: Link a couple of internal library dependencies in the android/app/build.gradle file.
dependencies {
implementation project(':rnfgservice')
implementation project(':rnwebrtc')
implementation project(':rnincallmanager')
}
Include dependencies in android/settings.gradle
include ':rnwebrtc'
project(':rnwebrtc').projectDir = new File(rootProject.projectDir, '../node_modules/@videosdk.live/react-native-webrtc/android')
include ':rnincallmanager'
project(':rnincallmanager').projectDir = new File(rootProject.projectDir, '../node_modules/@videosdk.live/react-native-incallmanager/android')
include ':rnfgservice'
project(':rnfgservice').projectDir = new File(rootProject.projectDir, '../node_modules/@videosdk.live/react-native-foreground-service/android')
Update MainApplication.java to register the InCall manager, WebRTC module, and foreground service packages.
import live.videosdk.rnfgservice.ForegroundServicePackage;
import live.videosdk.rnincallmanager.InCallManagerPackage;
import live.videosdk.rnwebrtc.WebRTCModulePackage;
public class MainApplication extends Application implements ReactApplication {
private final ReactNativeHost mReactNativeHost = new ReactNativeHost(this) {
@Override
protected List<ReactPackage> getPackages() {
List<ReactPackage> packages = new PackageList(this).getPackages();
/* Add the foreground service, incall manager and webrtc module packages manually, since they are not autolinked */
packages.add(new ForegroundServicePackage());
packages.add(new InCallManagerPackage());
packages.add(new WebRTCModulePackage());
return packages;
}
// ...keep the other overrides from the default template unchanged
};
}
Some devices might face WebRTC problems; to solve that, update your android/gradle.properties file with the following:
# This one fixes a weird WebRTC runtime problem on some devices.
android.enableDexingArtifactTransform.desugaring=false
If you use ProGuard, make the changes shown below in the android/app/proguard-rules.pro file (this step is optional):
-keep class org.webrtc.** { *; }
Step 3: Update the colors.xml file with some new colors for the internal dependencies.
<resources>
<item name="red" type="color">#FC0303</item>
<integer-array name="androidcolors">
<item>@color/red</item>
</integer-array>
</resources>
iOS Setup
Step 1: Install react-native-incallmanager
$ yarn add @videosdk.live/react-native-incallmanager
Step 2: Make sure you are using CocoaPods 1.10 or higher. To update CocoaPods, you can simply install the gem again.
$ [sudo] gem install cocoapods
Step 3: Manual linking (if react-native-incall-manager is not linked automatically):
- Drag node_modules/@videosdk.live/react-native-incall-manager/ios/RNInCallManager.xcodeproj under <your_xcode_project>/Libraries
- Select <your_xcode_project> --> Build Phases --> Link Binary With Libraries
- Drag Libraries/RNInCallManager.xcodeproj/Products/libRNInCallManager.a to Link Binary With Libraries
- Select <your_xcode_project> --> Build Settings and, in Header Search Paths, add $(SRCROOT)/../node_modules/@videosdk.live/react-native-incall-manager/ios/RNInCallManager
Step 4: Change the path of react-native-webrtc in your Podfile:
pod 'react-native-webrtc', :path => '../node_modules/@videosdk.live/react-native-webrtc'
Step 5: Change your platform version
- You have to change the platform field of the Podfile to 11.0 or above, as react-native-webrtc doesn't support iOS < 11:
platform :ios, '11.0'
Step 6: After updating the version, you have to install the pods:
$ pod install
Step 7: Then add "libreact-native-webrtc.a" to Link Binary With Libraries, in the target of your main project folder.
Step 8: Now add the following permissions to Info.plist (project folder/ios/projectname/Info.plist):
<key>NSCameraUsageDescription</key>
<string>Camera permission description</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone permission description</string>
Registering and Initializing VideoSDK Services
Register the VideoSDK services in the root index.js file to initialize them.
import { register } from '@videosdk.live/react-native-sdk';
import { AppRegistry } from 'react-native';
import { name as appName } from './app.json';
import App from './src/App.js';
// Register the service
register();
AppRegistry.registerComponent(appName, () => App);
Building the App Interface and Controls
Step 1: Before jumping to anything else, we have to write the API call that generates a unique meetingId. You will require an auth token; you can generate it either by using videosdk-rtc-api-server-examples or from the VideoSDK Dashboard for developers.
export const token = "<Generated-from-dashboard>";
// API call to create meeting
export const createMeeting = async ({ token }) => {
const res = await fetch(`https://api.videosdk.live/v1/meetings`, {
method: "POST",
headers: {
authorization: `${token}`,
"Content-Type": "application/json",
},
body: JSON.stringify({ region: "sg001" }),
});
const { meetingId } = await res.json();
return meetingId;
};
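As a quick sanity check, you can call this helper on its own and log the generated meetingId. This is just an illustrative snippet; it assumes the file above is saved as api.js, matching the "./api" import used later in App.js:
// Hypothetical sanity check: create a meeting and log its id
import { createMeeting, token } from "./api";

createMeeting({ token }).then((meetingId) => {
  console.log("Created meeting:", meetingId);
});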
Step 2: To build up the wireframe of App.js, we are going to use the VideoSDK hooks and Context Providers. VideoSDK provides us with MeetingProvider, MeetingConsumer, useMeeting, and useParticipant. Let's understand each of them.
First, we will explore Context Provider and Consumer. Context is primarily used when some data needs to be accessible by many components at different nesting levels.
- MeetingProvider: It is a Context Provider. It accepts config and token as props. The Provider component accepts a value prop to be passed to consuming components that are descendants of this Provider. One Provider can be connected to many consumers. Providers can be nested to override values deeper within the tree.
- MeetingConsumer: It is a Context Consumer. All consumers that are descendants of a Provider will re-render whenever the Provider's value prop changes (a usage sketch follows below).
- useMeeting: It is the React hook API for the meeting. It includes all the information related to the meeting, such as joining, leaving, enabling/disabling the mic or webcam, etc.
- useParticipant: It is the participant hook API. The useParticipant hook is responsible for handling all the events and props related to one particular participant, such as name, webcamStream, micStream, etc.
The Meeting Context helps you listen to all the changes when a participant joins the meeting, changes the mic or camera, and so on.
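This tutorial relies on the hooks, but for reference, here is a rough sketch of how MeetingConsumer could wrap the meeting UI instead. The MeetingScreen component name is hypothetical, the exact render-prop signature may differ, and it reuses the token and the MeetingView we define below in App.js:
// Hedged sketch: MeetingConsumer re-renders its children whenever the meeting context changes.
// (MeetingConsumer must also be imported from "@videosdk.live/react-native-sdk".)
function MeetingScreen({ meetingId }) {
  return (
    <MeetingProvider
      config={{ meetingId, micEnabled: false, webcamEnabled: true, name: "Test User" }}
      token={token}
    >
      <MeetingConsumer>{() => <MeetingView />}</MeetingConsumer>
    </MeetingProvider>
  );
}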
Let's get started by changing a couple of lines of code in App.js.
import React, { useState } from "react";
import {
SafeAreaView,
TouchableOpacity,
Text,
TextInput,
View,
FlatList,
} from "react-native";
import {
MeetingProvider,
useMeeting,
useParticipant,
MediaStream,
RTCView,
} from "@videosdk.live/react-native-sdk";
import { createMeeting, token } from "./api";
function JoinScreen(props) {
return null;
}
function ControlsContainer() {
return null;
}
function MeetingView() {
return null;
}
export default function App() {
const [meetingId, setMeetingId] = useState(null);
const getMeetingId = async (id) => {
const meetingId = id == null ? await createMeeting({ token }) : id;
setMeetingId(meetingId);
};
return meetingId ? (
<SafeAreaView style={{ flex: 1, backgroundColor: "#F6F6FF" }}>
<MeetingProvider
config={{
meetingId,
micEnabled: false,
webcamEnabled: true,
name: "Test User",
}}
token={token}
>
<MeetingView />
</MeetingProvider>
</SafeAreaView>
) : (
<JoinScreen getMeetingId={getMeetingId} />
);
}
Step 3: Let's now add a Join screen to our app, from which you can create a new meeting or join an existing one.
function JoinScreen(props) {
const [meetingVal, setMeetingVal] = useState("");
return (
<SafeAreaView
style={{
flex: 1,
backgroundColor: "#F6F6FF",
justifyContent: "center",
paddingHorizontal: 6 * 10,
}}
>
<TouchableOpacity
onPress={() => {
props.getMeetingId();
}}
style={{ backgroundColor: "#1178F8", padding: 12, borderRadius: 6 }}
>
<Text style={{ color: "white", alignSelf: "center", fontSize: 18 }}>
Create Meeting
</Text>
</TouchableOpacity>
<Text
style={{
alignSelf: "center",
fontSize: 22,
marginVertical: 16,
fontStyle: "italic",
color: "grey",
}}
>
---------- OR ----------
</Text>
<TextInput
value={meetingVal}
onChangeText={setMeetingVal}
placeholder={"XXXX-XXXX-XXXX"}
style={{
padding: 12,
borderWidth: 1,
borderRadius: 6,
fontStyle: "italic",
}}
/>
<TouchableOpacity
style={{
backgroundColor: "#1178F8",
padding: 12,
marginTop: 14,
borderRadius: 6,
}}
onPress={() => {
props.getMeetingId(meetingVal);
}}
>
<Text style={{ color: "white", alignSelf: "center", fontSize: 18 }}>
Join Meeting
</Text>
</TouchableOpacity>
</SafeAreaView>
);
}
Step 4: The next step is to create a ControlsContainer component that manages features such as joining or leaving the meeting and enabling or disabling the webcam/mic.
In this step, we will use the useMeeting hook to get all the required methods, such as join(), leave(), toggleWebcam(), and toggleMic().
So let's update ControlsContainer and add it to our MeetingView.
const Button = ({ onPress, buttonText, backgroundColor }) => {
return (
<TouchableOpacity
onPress={onPress}
style={{
backgroundColor: backgroundColor,
justifyContent: "center",
alignItems: "center",
padding: 12,
borderRadius: 4,
}}
>
<Text style={{ color: "white", fontSize: 12 }}>{buttonText}</Text>
</TouchableOpacity>
);
};
function ControlsContainer({ join, leave, toggleWebcam, toggleMic }) {
return (
<View
style={{
padding: 24,
flexDirection: "row",
justifyContent: "space-between",
}}
>
<Button
onPress={() => {
join();
}}
buttonText={"Join"}
backgroundColor={"#1178F8"}
/>
<Button
onPress={() => {
toggleWebcam();
}}
buttonText={"Toggle Webcam"}
backgroundColor={"#1178F8"}
/>
<Button
onPress={() => {
toggleMic();
}}
buttonText={"Toggle Mic"}
backgroundColor={"#1178F8"}
/>
<Button
onPress={() => {
leave();
}}
buttonText={"Leave"}
backgroundColor={"#FF0000"}
/>
</View>
);
}
function ParticipantList() {
return null;
}
function MeetingView() {
const { join, leave, toggleWebcam, toggleMic, meetingId } = useMeeting({});
return (
<View style={{ flex: 1 }}>
{meetingId ? (
<Text style={{ fontSize: 18, padding: 12 }}>
Meeting Id: {meetingId}
</Text>
) : null}
<ParticipantList /> {/* Will implement in next steps */}
<ControlsContainer
join={join}
leave={leave}
toggleWebcam={toggleWebcam}
toggleMic={toggleMic}
/>
</View>
);
}
Step 5: After implementing controls, it's time to render joined participants.
We will get the joined participants from the useMeeting hook.
function ParticipantView() {
return null;
}
function ParticipantList({ participants }) {
return participants.length > 0 ? (
<FlatList
data={participants}
renderItem={({ item }) => {
return <ParticipantView participantId={item} />;
}}
/>
) : (
<View
style={{
flex: 1,
backgroundColor: "#F6F6FF",
justifyContent: "center",
alignItems: "center",
}}
>
<Text style={{ fontSize: 20 }}>Press Join button to enter meeting.</Text>
</View>
);
}
function MeetingView() {
// Get `participants` from useMeeting Hook
const { join, leave, toggleWebcam, toggleMic, participants } = useMeeting({});
const participantsArrId = [...participants.keys()]; // Add this line
return (
<View style={{ flex: 1 }}>
<ParticipantList participants={participantsArrId} /> {/* Pass participants */}
<ControlsContainer
join={join}
leave={leave}
toggleWebcam={toggleWebcam}
toggleMic={toggleMic}
/>
</View>
);
}
Step 6: The next step is to update the participant view to show the participant's media, i.e. video and audio. Before handling participant media, we need to understand a couple of concepts.
1. useParticipant Hook
The useParticipant hook is responsible for handling all the properties and events of one particular participant who has joined the meeting. It takes participantId as an argument.
//Example for useParticipant Hook
const { webcamStream, webcamOn, displayName } = useParticipant(participantId);
2. MediaStream API
MediaStream is used to wrap a media track so it can be attached to the RTCView component to play the audio and video.
//MediaStream API example
<RTCView
streamURL={new MediaStream([webcamStream.track]).toURL()}
objectFit={"cover"}
style={{
height: 300,
marginVertical: 8,
marginHorizontal: 8,
}}
/>
So let us combine these two concepts and render the participant view.
function ParticipantView({ participantId }) {
const { webcamStream, webcamOn } = useParticipant(participantId);
// Render the stream only when the webcam is on and the stream is available
return webcamOn && webcamStream ? (
<RTCView
streamURL={new MediaStream([webcamStream.track]).toURL()}
objectFit={"cover"}
style={{
height: 300,
marginVertical: 8,
marginHorizontal: 8,
}}
/>
) : (
<View
style={{
backgroundColor: "grey",
height: 300,
justifyContent: "center",
alignItems: "center",
}}
>
<Text style={{ fontSize: 16 }}>NO MEDIA</Text>
</View>
);
}
Launching Your Video Calling App
# for android
npx react-native run-android
# for ios
npx react-native run-ios
Conclusion
With this, we have successfully built a React Native video calling app using the Video SDK. If you wish to add functionality like chat messaging and screen sharing, you can always check out our documentation. If you face any difficulty with the implementation, you can connect with us on our Discord community.