Create Virtual Camera

In previous stories, I talked about “How to use a virtual background in Google Meet for macOS”. There we used Snap Camera to build a virtual background with an existing lens. With the assistance of AlterCam’s virtual web camera, users can broadcast pre-recorded video files in real time! Spice up your webcasts by including all sorts of cool visual presentations that let you show and tell. Slideshows, screencasts, movies and more: anything saved as a video can be mounted in place of your regular webcam stream. Another option is the OBS Virtualcam plugin for OBS Studio, described below.

Supported Bit Versions: 32-bit, 64-bit
Source Code URL: https://github.com/Fenrirthviti/obs-virtual-cam/releases
Minimum OBS Studio Version: 25.0.0
Supported Platforms: Windows
NOTE: The horizontal flip option is bugged and will likely cause crashes. Please do not use it. If you need to flip your video, either flip the sources in OBS itself, or flip on the receiving end (i.e. in Zoom, Skype, etc.)
This plugin provides a DirectShow Output as a virtual webcam.
How to use:
OBS Virtualcam has two main methods for outputting video from OBS. The first is the Preview output, which is enabled from the Tools menu. This output will provide exactly what you see in the Preview in OBS, including any changes or scenes you might switch to. This is the most common method, and probably what you would want to use.
Preview Output:
1. Select Tools -> VirtualCam in the main OBS Studio window
2. Press the Start button, then close the dialog
3. Open your program (Zoom, Hangouts, Skype, etc.) and choose OBS-Camera as your webcam
The second method is a filter that you can add to any scene or source, for when you only want to output that specific scene or source and nothing else.
Source Filter:
1. Add a VirtualCam filter to the scene/source you want to output to the virtual camera
2. Choose a camera target then press Start
3. If the button does not change to Stop, it means the camera is already in use, and you must choose a different camera or stop the other output first.
4. Open your program (Zoom, Hangouts, Skype, etc.) and choose the camera you selected as the target as your webcam
Why are the resolution and framerate sometimes not the same as my OBS output settings?
If you open an OBS-Camera device in a 3rd party application before starting the output in OBS, OBS-Camera will default to 1080p 30fps. If you start OBS first, it will use whatever is set as the Output resolution and framerate in OBS Studio's options, under Settings -> Video.
Does this plugin support other platforms?
For Linux, you can use the Video4Linux sink plugin for OBS Studio. Directions on how to configure it are available from that link. Work is underway to provide a similar plugin for macOS, but there is no ETA.
Known issues
- Skype from the Windows Store doesn't work with OBS-Camera; please use Skype Desktop Edition instead.
- Unity can't get an image from the virtual camera out of the box. Use the registry batch file (reg_path.reg) from the release page to manually add a dummy device path; see the sketch below for opening the device from Unity. (Note: this registry entry conflicts with Skype Desktop; you can use unreg_path.reg to remove it.)
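If you are consuming the virtual camera from Unity, a minimal sketch like the one below can help confirm that the device is visible once the registry fix has been applied. It assumes the device shows up under the name "OBS-Camera" (the Preview output's default); the class and object names are only illustrative.

```csharp
using UnityEngine;

// Minimal sketch: look for the OBS virtual camera by name and display it on this
// object's material. Assumes the reg_path.reg fix mentioned above has been applied
// and that the device is exposed under the name "OBS-Camera".
public class ObsVirtualCamProbe : MonoBehaviour
{
    private WebCamTexture _obsCam;

    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            Debug.Log("Found camera: " + device.name);
            if (device.name == "OBS-Camera")
            {
                _obsCam = new WebCamTexture(device.name);
                GetComponent<Renderer>().material.mainTexture = _obsCam;
                _obsCam.Play();
                break;
            }
        }

        if (_obsCam == null)
            Debug.LogWarning("OBS-Camera not found; check the registry fix.");
    }

    void OnDisable()
    {
        if (_obsCam != null)
            _obsCam.Stop();
    }
}
```

Attach it to any object with a Renderer (a Quad works well) and check the Console for the list of detected cameras.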
Donate
Please consider donating to CatxFish, the original author of this plugin. Paypal.me

What you'll get from this page: advanced tips on Cinemachine, the de facto camera system in Unity. Underneath the surface, there is a refined system with loads of options and possibilities. Read on for more advanced use cases and tips for Cinemachine from Ciro Continisio, a Technical Evangelist at Unity.

Multiple Cinemachine Brains

Sometimes you need more than one camera in the scene, for split-screen multiplayer or maybe a picture-in-picture effect (e.g. a surveillance camera). Even though it might seem that Cinemachine, through the Brain component, takes control of all of your Cameras, there is a way to have multiple Brains in the scene, each one looking at its own set of Virtual Cameras.

In this situation, you can build a UI with a live video feed in it just by pointing a secondary Camera at the characters’ faces and rendering it to a RenderTexture. To give this video feed a bit more flavor, you can use Cinemachine to keep the characters’ faces in the frame.

Set up the Brains

The setup here is really simple: you create two Unity Cameras and attach a Cinemachine Brain to each. You can then create as many Virtual Cameras as you need.

To make sure a Brain only sees some of these VCams, you just need to do three things:

  1. Make sure the VCams that should drive a specific Camera are on a unique layer (for instance, “Characters”).
  2. Set that Camera’s Culling Mask to render that layer.
  3. Set the other Camera’s Culling Mask to exclude that layer.

Keep in mind that even though this process uses the Camera's Culling Mask, it doesn't mean you need to change how your scene is rendered. With 31 Layers to play with, you can just create one specifically to put VCams on.
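If you prefer to wire the same three steps up from a script, here is a minimal sketch. It assumes Cinemachine 2.x (where virtual cameras are CinemachineVirtualCamera components) and a project layer named "Characters" that you have already created; the field names are only illustrative.

```csharp
using UnityEngine;
using Cinemachine;

// Minimal sketch, assuming Cinemachine 2.x and a user-created layer named "Characters".
// Both cameras carry their own CinemachineBrain, as described above.
public class PictureInPictureSetup : MonoBehaviour
{
    public Camera mainCamera;                  // regular gameplay camera
    public Camera pipCamera;                   // secondary camera rendering to a RenderTexture
    public CinemachineVirtualCamera faceVCam;  // VCam framing the character's face

    void Start()
    {
        int charactersLayer = LayerMask.NameToLayer("Characters");

        // Step 1: put the face VCam on the dedicated layer.
        faceVCam.gameObject.layer = charactersLayer;

        // Step 2: make sure the PiP camera's Culling Mask includes that layer.
        pipCamera.cullingMask |= 1 << charactersLayer;

        // Step 3: make sure the main camera's Culling Mask excludes it,
        // so its Brain ignores the face VCam.
        mainCamera.cullingMask &= ~(1 << charactersLayer);
    }
}
```

Because the "Characters" layer only ever contains VCams (which have nothing to render), including or excluding it doesn't change what either camera actually draws, which is exactly the point made above.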


The full process is also recapped in the video at the end of this article.

A different “World Up” axis

Cinemachine operates on the assumption that there is a “World Up” axis, which by default is Y up. By using that as a reference, it knows which camera movements not to perform, for instance looking straight up or straight down at a target (real cameras don’t do that!).

In the case of 2D cameras (Cameras marked as Orthographic), Cinemachine behaves differently and constrains camera movement to the ground plane, which is the plane made up of the two remaining axes, in this case X and Z.

However, sometimes it’s necessary to force this “World Up” axis and the corresponding plane to something else. For instance, maybe you’re working on a game that uses 2D physics, so you’re forced to use the XY plane for your gameplay, but you want your camera to look at this plane from an angle, so your camera plane is a bit tilted, as in the image below.

To do so, you just need to assign a transform to the “World Up Override” property on the Cinemachine Brain. You can create an empty GameObject and use it as a manipulator, and you can experiment by rotating this object to find the appropriate “World Up” for your Cinemachine setup.
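In script form, that boils down to a single assignment. The sketch below assumes Cinemachine 2.x, where the "World Up Override" property appears as the public field m_WorldUpOverride on CinemachineBrain; the rotation value is just an example to experiment with.

```csharp
using UnityEngine;
using Cinemachine;

// Minimal sketch, assuming Cinemachine 2.x, where the Brain exposes the
// "World Up Override" property as the public field m_WorldUpOverride.
public class WorldUpSetup : MonoBehaviour
{
    public CinemachineBrain brain;  // the Brain on your Unity Camera

    void Start()
    {
        // Create an empty "manipulator" object whose up axis becomes Cinemachine's World Up.
        Transform worldUp = new GameObject("World Up Override").transform;

        // Example only: tilt it so the XY gameplay plane is viewed from an angle.
        // Rotate this object until the framing feels right for your setup.
        worldUp.rotation = Quaternion.Euler(-30f, 0f, 0f);

        brain.m_WorldUpOverride = worldUp;
    }
}
```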

Apply smart Post-Processing


Cinemachine works with both Post-Processing stack v1 and v2, but the latter is the better fit because its volume-based system allows some nice tricks when used together with Cinemachine.

When you use them together, it’s recommended that you split up your effects into two categories:

  1. General effects that should affect every shot.
  2. Effects that are unique to one shot (or, in Cinemachine lingo, VirtualCamera).

In general, effects like Ambient Occlusion, Colour Grading, Grain and other stylistic effects belong in the first group, since you want to keep your style consistent throughout your game or film. Effects like Depth of Field, Lens Distortion, Chromatic Aberration and other effects that simulate the physical properties of the camera are good candidates to live on the Virtual Cameras. These are not hard rules, though!

For instance, in the example above, you’ll notice the use of a very local Post-Processing volume to give the impression of looking through the lens of an old security camera. See the next section for how to set that up in the Editor.

Set up effects on a Virtual Camera

To achieve the above effect, you just need a Post Process Volume on the Virtual Camera itself that is not marked as “Is Global”, plus a collider. Usually a very small Sphere Collider will do (maybe 0.05 in radius), so that the Camera doesn’t enter this volume by mistake during gameplay. In other words, you’re going to have the Virtual Camera, the Post Process Volume and the Collider on the same object.
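If you'd rather assemble that object from a script, here is a minimal sketch. It assumes Post-Processing stack v2 and an existing PostProcessProfile asset holding the per-shot effects; the profile field name is a placeholder.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Minimal sketch, assuming Post-Processing stack v2. Attach this to the Virtual
// Camera's GameObject; "securityCamProfile" is a placeholder for whatever profile
// holds your per-shot effects (grain, lens distortion, etc.).
public class LocalShotVolume : MonoBehaviour
{
    public PostProcessProfile securityCamProfile;

    void Start()
    {
        // Tiny trigger collider so the volume has bounds, but the gameplay
        // Camera never wanders into it by accident.
        var sphere = gameObject.AddComponent<SphereCollider>();
        sphere.radius = 0.05f;
        sphere.isTrigger = true;

        // Local (non-global) volume living on the same object as the Virtual Camera.
        var volume = gameObject.AddComponent<PostProcessVolume>();
        volume.isGlobal = false;
        volume.sharedProfile = securityCamProfile;
        volume.priority = 10f; // higher than the global volume so its overrides win
    }
}
```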


Then you can add all the effects you want, and even override some from the main profile by ticking the appropriate checkboxes. Remember: when you don’t tick a checkbox, you stay on the base value for that property, usually coming from the global volume; if you do tick it, you’re overriding it only for this volume.
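The same checkbox mechanism is exposed to scripts as the override state of each parameter, in case you ever need to toggle an override at runtime. Here is a minimal sketch, assuming Post-Processing stack v2 and a Grain effect already present on the local profile; the values are only illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Minimal sketch, assuming Post-Processing stack v2. Ticking a checkbox in the
// inspector corresponds to enabling overrideState on that parameter; Override()
// sets both the state and the value. "localProfile" stands for the profile on
// the local, per-shot volume.
public class GrainOverrideExample : MonoBehaviour
{
    public PostProcessProfile localProfile;

    void Start()
    {
        if (localProfile.TryGetSettings(out Grain grain))
        {
            grain.intensity.Override(0.6f); // "checked": overrides the global value
            // grain.size stays unchecked, so the base/global value is used
        }
    }
}
```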


Create a Dolly Zoom effect

Finally, a little cinematography trick. Ever seen the so-called “Dolly Zoom” effect? It was pioneered by Alfred Hitchcock in the 1958 movie “Vertigo”, and it has been used in plenty of movies since.

On screen, it looks as if the space between the character and the background is expanding or contracting. It’s a pretty cool technique, and it’s very easy to recreate with Cinemachine.

Have a look at the example above, which is set to trigger in the middle of gameplay.


To create that effect, you can start with two Virtual Cameras with the same settings. Then you need to pull one of them back and reduce its Field of View value. When you blend between the two, via Timeline or using the Priority, you get a Dolly Zoom effect.
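Here is a minimal sketch of that setup, assuming Cinemachine 2.x and two VCams with no Follow target (so their transforms are used directly); the pull-back distance and Field of View values are only illustrative.

```csharp
using UnityEngine;
using Cinemachine;

// Minimal sketch, assuming Cinemachine 2.x. Two matching VCams: the "far" one is
// pulled back along its forward axis and given a narrower Field of View, then the
// blend is triggered by raising its Priority.
public class DollyZoomTrigger : MonoBehaviour
{
    public CinemachineVirtualCamera nearVCam;
    public CinemachineVirtualCamera farVCam;

    void Start()
    {
        // Start from two identical shots...
        farVCam.transform.SetPositionAndRotation(nearVCam.transform.position,
                                                 nearVCam.transform.rotation);
        farVCam.m_Lens = nearVCam.m_Lens;

        // ...then pull the second VCam back and tighten its Field of View.
        farVCam.transform.position -= farVCam.transform.forward * 6f;
        farVCam.m_Lens.FieldOfView = 20f;

        nearVCam.Priority = 10;
        farVCam.Priority = 5;
    }

    // Call this from gameplay (e.g. a trigger) to blend into the dolly zoom.
    public void TriggerDollyZoom()
    {
        farVCam.Priority = 20;
    }
}
```

Calling TriggerDollyZoom() raises the second VCam's Priority, so the Brain blends between the two shots and produces the dolly-zoom look.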


Remember that this effect works with Perspective Cameras only, not with Orthographic ones!
