When you select a workstation, it is important that you have adequate space and power for your memory, storage, and graphics hardware. Make sure your workstation has enough space and can support NVLink if you want to utilize multi-GPU rendering, which is available starting in Unreal Engine 4.26.

In general, you should favor faster clock speeds over more cores. It is recommended that you have at least a three gigahertz (GHz) clock speed as a starting point. Beyond eight cores, the benefits of additional cores will be more noticeable in compile times for code and shaders than in other use cases. Common examples of CPUs used for the in-camera VFX scenario include the Intel Xeon and Intel Core i9 processors, as well as the AMD Ryzen 9 3950X, AMD Threadripper, and AMD Threadripper Pro.

64 GB of DDR4 memory is the recommended minimum for most in-camera VFX scenarios. Additional RAM may be required if your production uses large files.

For artists using ray tracing and other advanced rendering features in Unreal Engine, we recommend the professional-level NVIDIA RTX cards. Artists with lighter rendering needs may be able to use the consumer-level NVIDIA RTX cards.

NVIDIA Quadro Sync is required to synchronize displays across an LED volume. Each render node machine must have this card in addition to its graphics card. For render nodes with either a single or dual GPU, we recommend NVIDIA RTX cards. For a complete list of graphics cards compatible with NVIDIA Quadro Sync, refer to the NVIDIA page for Quadro Sync. You can find more details in NVIDIA's Quadro Sync II User Guide.

If you plan to make use of nDisplay's support for rendering to multiple GPUs on a single computer, all graphics cards must also support NVLink.

We generally recommend using the latest drivers from NVIDIA. In particular, when using multiple GPUs on a single workstation, make sure to use version R512.59 or later. You can find the recommended driver for Virtual Production on NVIDIA's Download Drivers page. Select your card type and OS, and set Download Type to Production Branch / Studio to find the recommended driver.

If you plan to use live green-screen compositing, you will need an SDI video card to handle camera input, compositing output, and timecode synchronization. SDI video cards such as the AJA Kona 5 and the Blackmagic DeckLink are recommended for live compositing.

Because your project data is localized to each computer, fast local storage is necessary for optimal performance. It is recommended that you use M.2 solid-state drives (SSDs), such as the Samsung 970 Pro, as secondary data drives separate from the machine's boot drive.

A 10 Gigabit Ethernet (GbE) Network Interface Controller (NIC) is recommended to maintain high-speed data transfer between operator systems and render machines. Most 10 GbE Layer 2 or Layer 3 network switches, such as the Netgear Smart Switch, should be sufficient for this scenario.

There are wired and wireless options available for sync generators, and it is common to use both on an in-camera VFX set. For example, you can use a wired box to send signals to a wireless one.

In this section you will connect a Smart lens using an Ambient Master Lockit Plus device and Live Link. The Smart lens on the production camera will be connected to the Master Lockit Plus, and the Master Lockit Plus will connect to the Unreal Engine workstation's network by Ethernet.

Click Settings > Plugins to open the Plugins menu. Click the Virtual Production category and search for the LiveLinkMasterLockit plugin. Enable the plugin and click Yes on the message window to restart the editor.

Once the editor finishes loading, go to the Live Link window and click + Source > MasterLockit. Enter the Master Lockit's IP address and click OK. Select the Master Lockit Device in the window and click View Options, then select Show Frame Data. Confirm the values are updating correctly in the Live Link section.

Select the CineCamera Actor in the World Outliner window and go to the Details panel. Select the Live Link Controller component and scroll down to the Live Link section. Click the Subject Representation dropdown and select your Master Lockit Device.

Select the Camera Component and scroll down to the Focus Settings. Change the Focus Method setting from Manual to Disabled. This stops the focus from changing on the CG Cinema Camera, as focus is being controlled by the physical camera. Verify that changing the focus and aperture on the physical camera updates those same settings in the CineCamera Actor.

In this guide you connected your Master Lockit Device by using Live Link and streamed Focus, Iris, and Zoom from a Smart lens on the production camera.
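To see why the 10 GbE NIC, SDI cards, and fast M.2 storage recommended above matter, it helps to estimate the raw data rate of an uncompressed camera feed. The sketch below is a back-of-the-envelope calculation, assuming 10-bit 4:2:2 sampling (roughly 20 bits per pixel); the resolutions and frame rates are illustrative examples, not requirements from this guide.

```python
# Back-of-the-envelope estimate of uncompressed video bandwidth.
# Assumes 10-bit 4:2:2 sampling (~20 bits per pixel); real SDI links
# also carry blanking and ancillary data, so they need extra headroom
# beyond this raw figure.

def video_gbps(width: int, height: int, fps: float, bits_per_pixel: int = 20) -> float:
    """Raw video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Illustrative formats:
print(f"HD 1080p24: {video_gbps(1920, 1080, 24):.2f} Gb/s")
print(f"4K DCI 24:  {video_gbps(4096, 2160, 24):.2f} Gb/s")
print(f"4K DCI 60:  {video_gbps(4096, 2160, 60):.2f} Gb/s")
```

A single uncompressed 4K 60 fps feed already approaches the capacity of a 10 GbE link, which is why dedicated SDI hardware and fast local SSDs are preferred over pulling media across the network during a take.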
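The sync generators mentioned above keep every camera and render node counting the same frame. As an illustration of what "in sync" means in practice, here is a standalone sketch (not code from any sync-generator vendor) that parses SMPTE-style HH:MM:SS:FF timecode strings and reports the frame offset between two devices.

```python
# Illustrative SMPTE-style timecode comparison (non-drop-frame).
# Standalone sketch for explanation only; not a vendor API.

def tc_to_frames(tc: str, fps: int) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frame_offset(tc_a: str, tc_b: str, fps: int = 24) -> int:
    """Frames by which device A leads (+) or trails (-) device B."""
    return tc_to_frames(tc_a, fps) - tc_to_frames(tc_b, fps)

print(frame_offset("01:00:00:05", "01:00:00:03"))  # 2
```

On a properly genlocked set, this offset should be zero for every device; any persistent non-zero offset points to a sync or cabling problem.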
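The Show Frame Data view in the tutorial above displays the lens metadata arriving over Live Link. To make that data flow concrete, here is a hedged sketch that parses a hypothetical JSON frame-data packet; the field names and packet shape are invented for illustration and do not reflect the actual MasterLockit wire protocol, which the Live Link plugin handles for you.

```python
import json

# Hypothetical frame-data packet; the real MasterLockit protocol differs
# and is handled internally by the LiveLinkMasterLockit plugin.
SAMPLE_PACKET = b'{"timecode": "01:00:00:12", "focus_m": 2.5, "iris_tstop": 2.8, "zoom_mm": 35.0}'

def parse_frame_data(packet: bytes) -> dict:
    """Decode one frame-data packet into a plain dict, with basic sanity checks."""
    data = json.loads(packet)
    # Reject physically impossible values before using them to drive a camera.
    assert data["focus_m"] > 0 and data["zoom_mm"] > 0
    return data

frame = parse_frame_data(SAMPLE_PACKET)
print(frame["focus_m"], frame["iris_tstop"], frame["zoom_mm"])
```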
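Setting Focus Method to Disabled works because the physical lens, not the engine, now drives focus. Lens systems generally calibrate raw encoder positions against measured distances and interpolate between them; the sketch below shows generic linear interpolation over a made-up calibration table. It is purely illustrative and is not how the Smart lens or Live Link actually computes focus.

```python
from bisect import bisect_right

# Made-up calibration table: (raw encoder value, focus distance in meters).
CALIBRATION = [(0, 0.3), (1000, 1.0), (2000, 3.0), (3000, 10.0)]

def focus_distance(encoder: int) -> float:
    """Linearly interpolate a focus distance from a raw encoder value."""
    xs = [x for x, _ in CALIBRATION]
    if encoder <= xs[0]:
        return CALIBRATION[0][1]
    if encoder >= xs[-1]:
        return CALIBRATION[-1][1]
    i = bisect_right(xs, encoder)
    (x0, y0), (x1, y1) = CALIBRATION[i - 1], CALIBRATION[i]
    return y0 + (y1 - y0) * (encoder - x0) / (x1 - x0)

print(focus_distance(1500))  # 2.0
```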