Blue Iris GPU FAQ

Collected notes from user discussions on getting Blue Iris to use a GPU: hardware-accelerated decoding, AI object detection with DeepStack / CodeProject.AI, and access to the Blue Iris API.
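Since access to the Blue Iris API comes up below, here is a minimal sketch of the JSON login handshake against the web server's /json endpoint, as it is commonly scripted. Treat the field names and challenge format as assumptions to verify against your BI version's help file; the host, user, and password are placeholders.

```python
# Hedged sketch of the Blue Iris JSON API login handshake (the /json endpoint).
# Assumption: BI answers the first "login" with a session id, then expects
# MD5("user:session:password") as the challenge response. Verify against
# your Blue Iris version's documentation before relying on this.
import hashlib
import json
import urllib.request

def login_response(user: str, password: str, session: str) -> str:
    """Build the MD5 challenge answer: md5 of 'user:session:password'."""
    return hashlib.md5(f"{user}:{session}:{password}".encode()).hexdigest()

def bi_login(base_url: str, user: str, password: str) -> str:
    """Two-step login; returns the authenticated session id."""
    def post(payload: dict) -> dict:
        req = urllib.request.Request(
            f"{base_url}/json",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    first = post({"cmd": "login"})  # server replies with a session id
    session = first["session"]
    post({"cmd": "login", "session": session,
          "response": login_response(user, password, session)})
    return session
```

Once logged in, the same session id is sent with subsequent /json commands (camera list, status, and so on).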
Q: Is there any way to tell Blue Iris to use my GPU instead of the CPU? My CPU is consistently at 100% utilization.

A: If hardware-accelerated decode is set to No, or to Intel or Intel+VPP on a system without a working Intel iGPU, Blue Iris decodes in software and uses 100% CPU non-stop, no matter how much else is available (one report: 128 GB of RAM and an NVIDIA GeForce RTX 4090, still pegged). To see what is actually in use, click the graph-looking icon at the very top of the BI admin console to open the Blue Iris Status screen, check the Logs tab for any mention of hardware acceleration, then switch decode settings and compare.

Q: Blue Iris still shows high CPU even though I installed the CUDA 12 version of CodeProject.AI GPU for my Tesla P4, in Docker under unRAID. Why?

A: The AI server handles object detection, not video decoding, so a detection GPU will not reduce decode load. It is still worth the effort: CodeProject.AI (formerly SenseAI) / DeepStack does object recognition and reduces motion alert errors, because Blue Iris can send an image to the AI server to verify whether an object of interest is actually in the picture.

Q: Is there a chipset or specific GPU line better suited to this? I have a GPU lying around and wondered if there is any benefit adding it to the PC — the CPU is a 6-core/12-thread part with an integrated GPU for about 311 USD, I have no issues with it, and it is inexpensive.

A: Check actual load first; often the GPU's utilization, memory load, and other counters are VERY low already. A good way to verify (thank you @fenderman for this tip) is GPU-Z from TechPowerUp: open the Sensors tab, then start Blue Iris and watch the counters.

Reported working setups: Blue Iris and CodeProject.AI on a Windows 10 VM in Proxmox; a Windows 10 PC with a Ryzen 5 3400G, 16 GB RAM, 1 TB SSD, 4 TB HDD, and a GTX 970; and 16 cameras on an HP ProDesk 600 G5 SFF (i7-9700, 3.0 GHz), with some components deliberately a bit overkill for future-proofing. When migrating an installation, change the stored path to the path of your "new" Windows folder.
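The "send an image to Sense AI / DeepStack to verify an object of interest" step above amounts to POSTing a JPEG to a DeepStack-style /v1/vision/detection route (CodeProject.AI Server exposes the same convention) and reading back JSON predictions. A sketch of handling such a response — the sample payload mirrors the documented response shape, and the 0.4 confidence threshold is an illustrative value, not a Blue Iris default:

```python
# Parse a DeepStack/CodeProject.AI-style detection response and keep only
# confident predictions. The threshold here is illustrative; the actual
# upload is a multipart POST of the JPEG to http://host:port/v1/vision/detection.
import json

def parse_detection(response_text: str, min_confidence: float = 0.4):
    """Return (label, confidence) pairs at or above the threshold."""
    payload = json.loads(response_text)
    if not payload.get("success"):
        return []
    return [
        (p["label"], p["confidence"])
        for p in payload.get("predictions", [])
        if p["confidence"] >= min_confidence
    ]

# Example response shape returned by the detection endpoint:
sample = json.dumps({
    "success": True,
    "predictions": [
        {"label": "person", "confidence": 0.91,
         "x_min": 10, "y_min": 20, "x_max": 110, "y_max": 220},
        {"label": "dog", "confidence": 0.22,
         "x_min": 0, "y_min": 0, "x_max": 50, "y_max": 50},
    ],
})

print(parse_detection(sample))  # [('person', 0.91)]
```

Blue Iris does this round-trip for you when an AI server is configured; the sketch is only to show what is on the wire, which helps when testing the server independently with curl or a script.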
Q: Do graphics cards matter for Blue Iris performance?

A: I agree with others here: GPUs or graphics cards are not that important for the raw performance of streaming multiple IP cameras at once to multiple users. Where they matter is video decode and AI detection. Intel hardware acceleration works just fine in Blue Iris, and both Blue Iris and CodeProject.AI can use the CPU or a GPU, with the GPU option generally performing significantly better for detection. The real open question for Nvidia cards is whether Blue Iris opens a separate decode session per camera stream, which determines how well NVDEC scales across many cameras.

Q: With the launch of Intel Arc GPUs with Quick Sync support, how do you think it's going to change the Blue Iris game?

A: It likely makes an Nvidia GPU a poor choice for decode duty, since Quick Sync is cheap and well supported. For AI alternatives, see the blue-onyx project on GitHub (xnorpx/blue-onyx), a standalone object-detection server. Recent CodeProject.AI release notes also cover running the AI server on a different system than Blue Iris while using that system's GPU, improved Raspberry Pi support, and a recommendation to disable GPU for modules you are not using when they share one card.

Q: Is a dedicated GPU worth setting up just for DeepStack?

A: GPU + DeepStack + Blue Iris = amazing. One user who took the time to configure DeepStack GPU with BI (i7-6700T, 4 GB GTX 960, 16 GB RAM, 1 TB M.2) calls it fantastic and wishes he had done it sooner; several bought a used, cheap Nvidia card for exactly this after reading that it was recommended.

Storage tip: check "Also BVR", set a storage limit, uncheck "Limit clip age", and set the folder to delete — clips are then pruned by size rather than age.

Q: I have tried with "Use GPU" both checked and unchecked, and version 2.x does not use the GPU even when flagged; Blue Iris is not showing any GPU usage in Task Manager either.

A: Task Manager's default GPU graphs do not reliably show video decode, so check Blue Iris itself: in the camera list, when a value is shown, the camera is currently using hardware decoding (I=Intel, I+=Intel+VPP, N=Nvidia, DX=DirectX, I2=Intel). At its best, hardware acceleration was good for about a 50% reduction in CPU usage; conversely, with CPU only and no hardware acceleration, a heavily loaded system can lock up.

Q: AI performance seems kind of low — it takes 1000–3000 ms to analyze one 2560x1440 frame (Intel NUC 8, Core i7-8650U, 8 GB RAM).

A: Times like that usually mean detection is running on the CPU. Setups that do work: Blue Iris and CodeProject.AI reporting person, car, and animal detections on a Gigabyte Aorus Z590 motherboard with an i7-10700K (integrated graphics); and Blue Iris in a Win10 VM without GPU passthrough under Proxmox, which works great (from my understanding, or lack thereof, you would need Windows Hyper-V for GPU passthrough on a Windows host). One user spent a long time banging his head against the wall getting CodeProject.AI to use a GT 1030 and wrote up exact instructions for Windows 10 x64 so future people know what to do.

Q: With DeepStack apparently now deprecated in Blue Iris — leaving some DeepStack users reluctant to update to newer versions of BI — what is the received wisdom?

A: The supported path forward is CodeProject.AI; several posters in these threads are building new PCs for Blue Iris and CodeProject.AI rather than staying on DeepStack.

(Typical signatures in these threads: Blue Iris 5.x x64 | Windows 10 Pro x64 | 16 GB RAM | i7-7700; Blue Iris 5.x | Server 2025 VM | Xeon E5-2660 v3.)
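The clip storage advice above (set a limit, delete by size) and the Hard Drive Space Calculator it alludes to come down to simple arithmetic: disk capacity divided by the aggregate recording rate. A sketch with example values — the camera count, bitrate, and disk size are illustrative, not recommendations:

```python
# Back-of-envelope retention estimate for continuous direct-to-disk recording.
# Example inputs only; real-world bitrates vary with codec, scene, and settings.

def retention_days(disk_tb: float, cameras: int, bitrate_mbps: float) -> float:
    """Days of footage a disk holds: disk bytes / aggregate bytes per day."""
    bytes_per_day = cameras * (bitrate_mbps * 1_000_000 / 8) * 86_400
    return (disk_tb * 1_000_000_000_000) / bytes_per_day

# 8 cameras at 4 Mbps each, recording continuously to a single 4 TB drive:
print(round(retention_days(4, 8, 4), 1))  # 11.6 days
```

Working the numbers the other way tells you how big a storage limit to set in Blue Iris for a target retention period.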
Q: I set hardware-accelerated decode to Nvidia NVDEC and even set up substreams, but results are mixed. Why?

A: Sub streams are lower quality first of all, and most "value" cameras have flaky substream behavior in Blue Iris or just don't work at all. Performance of Nvidia CUDA acceleration scales with the card's decode engine rather than its 3D performance, and — unfortunately — when you have both an Intel integrated GPU and a dedicated card, Blue Iris currently can't direct work properly to each GPU. (HA here means hardware acceleration; an iGPU is an integrated GPU.) Hardware acceleration may also affect performance positively or negatively depending on the system, so test: one user ensured every camera was set to decode DEFAULT and compared, another tried "Intel", "Intel + VPP", and "Nvidia" per camera.

Q: What acceleration does Blue Iris actually support?

A: Blue Iris does support Intel and Intel+VPP, so an Arc card can help if it exposes those APIs; it supports H.264 and H.265 hardware acceleration via Nvidia CUDA using recent Nvidia graphics cards; and worst case, you can fall back to DirectX VA2 or Direct3D11 VA. AMD is effectively out: one ASUS ROG STRIX RX580 8GB owner reports the GPU never goes above 2% since Blue Iris can't use it. Xeons don't have Quick Sync, so no Intel hardware H.264 decode there. Best case is a computer with an Intel Quick Sync iGPU, then maybe a computer with an Nvidia GPU — and the DeepStack CPU version still works if you have neither.

Q: What hardware are people actually running? (Blue Iris is a local NVR security camera software, so recommended CPU and hardware come up constantly.)

A: Reference points from the threads:
- 6 cameras (5 Dahua, 1 Reolink): CPU utilisation fine, typically 12-16%.
- CodeProject.AI on an Intel i9-13900K CPU with an NVIDIA RTX 3060 GPU.
- i7-9700K @ 4.9 GHz, 32 GB DDR4, 2x 1 TB M.2, Blue Iris running as a service.
- i7-12700K, 32 GB DDR4 in a Corsair Obsidian full tower; DeepStack GPU on an Nvidia GTX 970; cameras 2231R-ZS, 2231T-ZS, and three 3241T-ZAS.
- Blue Iris 5 running the CodeProject.AI .NET module on a 4 GB GTX 970, with a Dahua PTZ5A4M-25X 5.6 mm.
- A quite big installation: i9-7920X @ 2.90 GHz, 64 GB RAM, GeForce RTX 2080, 6x 8 TB disks, 35 cameras and wanting to expand.
- About 43 cameras (15 at 5 MP, 26 at 2 MP, 2 at 3 MP 360°) on an i7-9700 with 64 GB RAM and 175 TB of storage.
- A 3.00 GHz box with 16 GB RAM and a 512 GB SSD that moves the files to a NAS.
- A Windows 10 VM with 12 vcores on a Ryzen 5600G Proxmox host, recording 61 cameras, doing OK.
- 9 cameras, all substreams low-res, plus DeepStack GPU and AI Tool.
Small machines can struggle thermally — one Intel NUC would overheat under load. Before buying, sit down and read the "Optimizing Blue Iris's CPU Usage" wiki article; users are glad they finally did.

Q: TrunkMonkey wrote (Sun Feb 04, 2024, 7:59 pm): I see there's a lot of talk about leveraging a Coral TPU instead of a GPU for AI. Related: is it better to run the AI server on a dedicated Linux VM, and can it use a GPU there (if so, what model/OS)?

A: Both remain open questions in the threads. Relevant data points: a BI VM on 16 cores and 16 GB RAM rarely exceeds 10-15% CPU, so there is headroom either way, and some users deliberately leave room to add a GPU further down the road for fancier AI detection. Thinking about going to an AMD 3400G for the Vega 11 graphics came up too, but note the AMD caveat above.

Q: How do I verify GPU usage and AI load?

A: The Blue Iris Status screen shows per-camera hardware acceleration, and you can check with GPU-Z or Task Manager — but don't trust Task Manager blindly. You may need to run something like MSI Afterburner to see decode usage; opening the Blue Iris console raises GPU usage temporarily because the live view is GPU-rendered; and inside a VM the counters can be nonsense (one BI VM showed ~12% CPU and 80% GPU usage with no GPU installed at all, yet worked great with markedly good responsiveness). How much GPU 3D performance you lose while the GPU spends, say, 13% on decoding was asked but not definitively answered. Tuning that helped one user drop from a former 25-60% CPU on defaults: dial down the frame rates and bit rates of each camera from the camera's own UI, in addition to using direct-to-disk recording.

Q: Any AI-module settings worth knowing?

A: The bigger the model size you choose (Medium, Large), the more CPU or GPU resources are consumed; the common AI setting in BI is "medium". Check the Facial recognition box only if you intend to detect faces. For Blue Iris users with a Custom Model Folder specified in Blue Iris (including those who directly edited the registry), the CodeProject.AI setup takes that folder into account. Known hiccups: the DeepStack GPU version on Windows 10 can have an issue starting from Blue Iris; one user tried putting a GTX 660 back in; another upgraded to Windows 11 last year on a Ryzen 3400G with a GTX 970 running DeepStack GPU and all the latest drivers. There is also a guide for setting up and configuring Blue Iris on a Windows Server 2019 computer.

An aside from the threads: ipcamtalk is not directly connected to Blue Iris, and some users report bad experiences with moderation there.
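For the "check with GPU-Z / MSI Afterburner / Task Manager" advice above, Nvidia users can also poll nvidia-smi directly. A sketch, assuming nvidia-smi is on the PATH; the query fields used are standard --query-gpu options, and the hard-coded sample line is just for illustration:

```python
# Check whether the Nvidia card is actually busy when Task Manager shows
# nothing useful. query_gpu() shells out to nvidia-smi (must be on PATH);
# parse_utilization() handles one line of 'csv,noheader,nounits' output.
import subprocess

def parse_utilization(csv_line: str) -> dict:
    """Parse 'gpu_util, mem_util' from a nounits CSV line into percentages."""
    gpu, mem = (int(v.strip()) for v in csv_line.split(","))
    return {"gpu_pct": gpu, "memory_pct": mem}

def query_gpu() -> dict:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,utilization.memory",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    return parse_utilization(out)

print(parse_utilization("3, 12"))  # {'gpu_pct': 3, 'memory_pct': 12}
```

Note that overall GPU utilization can sit near zero while NVDEC is busy; `nvidia-smi dmon` or GPU-Z's video-engine sensors give a clearer per-engine picture.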
Q: The GPU is handling the object detection, but it's surprisingly slow. What other options are there?

A: Settings that help on a constrained system: direct-to-disk recording in Blue Iris's BVR format, 15 FPS recording, motion triggering off (continuous recording), Intel hardware acceleration, and disabled overlays. The AI Docker container can also run outside the VM: on an old 4930K (6 cores / 12 threads) where the Blue Iris VM uses half the cores, the Docker host had the same amount of cores available. One unresolved report: even after uninstalling Windows update KB5035853, the system was still crashing, or Blue Iris was winding up with broken/stuck streams.
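When weighing options like "15 FPS recording" above, it helps to estimate pixel throughput — the megapixels-per-second figure Blue Iris reports per camera and which tracks decode load. A quick sketch; the resolutions and frame rates are example values:

```python
# Rough per-camera pixel-throughput estimate (megapixels per second).
# Halving FPS halves this number, which is why dialing frame rates down
# at the camera is such an effective CPU reduction. Example values only.

def megapixels_per_second(width: int, height: int, fps: float) -> float:
    return width * height * fps / 1_000_000

main = megapixels_per_second(2560, 1440, 15)  # one 4 MP main stream at 15 FPS
sub = megapixels_per_second(640, 480, 15)     # a low-res substream at 15 FPS
print(round(main, 1), round(sub, 1))  # 55.3 4.6
```

The roughly 12x gap between the two numbers is the whole argument for letting Blue Iris watch substreams and only touching the main stream when recording or triggered.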