Need help configuring a computer system to run Nvidia's Isaac Sim 5.0 (Nvidia's robot simulation software)
I have built hobby robots before, but I just discovered the Nvidia Jetson Orin Nano Super Developer Kit and want to build a robot around it. Nvidia's GPU-optimized robotics software stack runs on it, so it would be exciting to see what it can do.
This project coincides with having to replace my existing development PC, which is too old to transition to Windows 11. (Microsoft will stop supporting Windows 10 in a few months.)
So, I wanted to replace my PC with one that can run Nvidia's Isaac ROS software. I also plan to add custom modules to the software to support unique features of my robot.
The most challenging requirements come from Isaac Sim 5, which lets robots be modeled, developed, and tested in Nvidia's Omniverse. (https://docs.isaacsim.omniverse.nvidia.com/latest/index.html) It requires a multi-core, gaming-class computer with 64 GB of RAM and an Nvidia graphics card that has at least 16 GB of VRAM and is more powerful than the old GeForce RTX 3070. (https://docs.isaacsim.omniverse.nvidia.com/latest/installation/requirements.html)
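For anyone sanity-checking their own parts list, here's a tiny script comparing a planned build against the two hard numbers above. The RAM and VRAM minimums are the figures from the requirements page; the "planned" values describe my build and are assumptions:

```python
# Compare a planned build against the Isaac Sim figures cited above.
# (64 GB RAM, 16 GB VRAM; the "planned" numbers are my build, assumed.)
requirements = {"ram_gb": 64, "vram_gb": 16}
planned = {"ram_gb": 64, "vram_gb": 16}   # 2 x 32 GB DDR5, RTX 5060 Ti 16 GB

for key, needed in requirements.items():
    ok = planned[key] >= needed
    print(f"{key}: need {needed}, have {planned[key]} -> {'OK' if ok else 'SHORT'}")
```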
I would like to build this computer for about $2K and my hard limit is $3K.
So far, this is the build that I am considering.
My thoughts are…
A dual-boot computer: Windows 11 Pro for my normal work, and Ubuntu 22.04 for running the Robot Operating System (ROS 2) software. I would rather have Windows 11 as my base operating system and run Ubuntu inside a Hyper-V virtual machine. However, Nvidia recommends the non-VM setup and will not support problem resolution for issues reported from a VM environment. So, I need a pure Ubuntu environment to report bugs from.
GPUs: I would like a high-performance GPU card for modeling and simulation, with the ability to switch to the CPU's integrated graphics when I don't need the performance. The two GPUs would also come in handy when using the Hyper-V VM. Hyper-V permits assigning hardware to the OS running inside the VM: my Windows host can continue to drive its display from the integrated GPU while the Ubuntu/ROS software has complete use of the Nvidia GPU inside the VM. (I plan to assign the Nvidia graphics card, a USB port, and perhaps the Ethernet (RJ45) hardware to Ubuntu.) The GeForce RTX 3070 is ancient; I'd rather invest in something newer, maybe an Nvidia RTX 5000-series GPU with at least 16 GB of memory. The GeForce RTX 5060 Ti (16 GB) fits the need. Since the RTX 5000-series chipsets support PCIe 5.0, that became a requirement for the motherboard. I expect to install this graphics card in the PCIe slot connected directly to the CPU. Since I am selecting an Asus TUF motherboard, I selected an Asus TUF graphics card.
CPU: I want lots of cores. Robotics calculations are highly parallelizable, especially the big software builds. My soon-to-be-obsolete PC is an Intel i7 (7th gen; 4 cores, 8 threads), so I was thinking about going to many more threads: either the Intel Core Ultra 9 (24 cores, 1 thread per core) or the AMD Ryzen 9 9900X3D (12 cores, 24 threads). When I examined their respective supporting chipsets, I decided on the AMD Ryzen because its platform supports the (in my belief) more standard USB4, where Intel chose to back its Thunderbolt interface instead. At this point, I don't believe Intel is strong enough to win by promoting a proprietary interface over the next generation of a standard.
Motherboard: Although this will be my home desktop PC, I am likely to need to transport it to "show-and-tell" locations. My last two generations of motherboards have been Asus TUF boards. They have performed flawlessly, so I guess I am biased. I selected the Asus TUF Gaming X870-Plus WiFi motherboard.
Two hard drives: Previously I have been plagued by UEFI Secure Boot code interfering with dual-boot configurations. I have found it's easier to dedicate one drive to Windows and another to Linux, then use the BIOS controls to select a boot drive. Having NTFS and ext4 partitions on the same drive led to unnecessary complications. But why hard drives? Because I have burnt out SSDs in a few months. SSD memory wears out when the same locations have been overwritten a few hundred thousand times. SSD controllers use wear-leveling algorithms to keep the same physical locations from being constantly overwritten, but Windows loves to constantly update its Registry files, and each tiny change causes a 4K block to be rewritten. Microsoft understands this and has published tips to reduce how often Windows rewrites. But I believe only the PC OEMs know enough of these secret switches to make an SSD drive last multiple years.
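To put rough numbers on the wear concern, here's a back-of-envelope endurance estimate. Every figure below (the TBW rating, the write rate, the write-amplification factor) is an assumption for illustration, not a spec for any particular drive:

```python
# Back-of-envelope SSD endurance estimate. All figures are assumptions,
# not specs for any particular drive.
TBW_RATING_TB = 600        # assumed endurance rating for a 1 TB consumer drive
WRITES_PER_SEC = 50        # assumed rate of tiny registry/log updates
BLOCK_BYTES = 4096         # each tiny update rewrites a full 4 KiB block
WRITE_AMPLIFICATION = 10   # assumed flash write amplification for small writes

bytes_per_day = WRITES_PER_SEC * BLOCK_BYTES * WRITE_AMPLIFICATION * 86_400
tb_per_day = bytes_per_day / 1e12
years_to_wear_out = TBW_RATING_TB / tb_per_day / 365
print(f"{tb_per_day:.3f} TB/day -> ~{years_to_wear_out:.0f} years to reach TBW")
```

Plugging in your own drive's TBW rating and a measured write rate (e.g. from Task Manager or SMART host-writes counters) makes the estimate concrete; very high sustained write loads can shorten the result dramatically.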
Also, SSD memory is certainly not appropriate for use as OS swap space. I expect that when using a ROS VM, large portions of the host Windows environment will be paged out to the swap area, and while I experiment with and tune my memory configuration, swap space use may become excessive.
RAM memory: The general approach for AI computation is to have enough RAM to hold all the working data; when tuned correctly, a system would only need RAM and SSD storage. I am not smart enough to predict the right amount of RAM for my project. I just know I need a lot, and that the Nvidia Isaac Sim 5.0 documentation recommends at least 64 GB. So, I want a build that initially comes with 64 GB but has additional sockets, so that I can increase it in the future. Common motherboards have 4 RAM sockets and can be initially configured with two 32 GB memory sticks. That leaves adequate expansion space.
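The expansion math, with the slot count and stick size assumed for a typical 4-slot board (check the motherboard's QVL and maximum supported capacity before buying):

```python
# RAM expansion plan. Slot count and stick size are assumptions for a
# typical 4-slot desktop board; verify against the motherboard's specs.
slots_total = 4
stick_gb = 32
sticks_now = 2                    # start with 2 x 32 GB = 64 GB
now_gb = sticks_now * stick_gb
max_gb = slots_total * stick_gb   # fill all slots with matching sticks
print(f"initial {now_gb} GB, expandable to {max_gb} GB")
```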
M.2 NVMe SSDs: The one thing I love about SSD storage is that it makes my computer boot very quickly. I want an M.2 SSD to boot Windows from, so I specified a 1 TB SSD as my Windows C: drive (mounted in M.2_3). I also want a second 1 TB SSD formatted ext4 for Ubuntu use (mounted in M.2_1, driven directly by the CPU).
CPU Cooler: I wanted a CPU cooler with an LCD temperature display. This ROG cooler is overkill, but the others had poor or no reviews.
Display: I don't expect the link from the graphics card to the display to be a bottleneck; an HDMI 2.0 interface to a 60 fps 4K display should be adequate.
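A rough payload-bandwidth check supports that. This counts only visible pixels and ignores blanking intervals, so the real link rate runs somewhat higher, but the margin is still comfortable:

```python
# Rough bandwidth check for 4K @ 60 Hz over HDMI (8-bit RGB, uncompressed).
# Ignores blanking intervals, so the actual link rate is somewhat higher.
width, height, fps, bits_per_px = 3840, 2160, 60, 24
payload_gbps = width * height * fps * bits_per_px / 1e9   # visible pixels only
hdmi20_usable_gbps = 14.4   # HDMI 2.0: 18 Gbps raw, 8b/10b coding -> 14.4 usable
print(f"payload ~{payload_gbps:.1f} Gbps vs HDMI 2.0 usable ~{hdmi20_usable_gbps} Gbps")
```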
Chassis: Any decent Mid-Tower ATX chassis capable of cooling 800 Watts should work fine. The one I chose has two 3.5" HDD bays.
Power Supply: Calculations indicate that the components I chose can draw about 800 W, so I selected a power supply rated 25% above my calculated need.
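The arithmetic behind that choice, with an assumed ladder of common retail PSU wattages:

```python
# PSU sizing check. The 800 W load figure is from my component calculations;
# the list of retail wattages is an assumption about what's commonly sold.
estimated_load_w = 800
margin = 0.25                              # 25% headroom over calculated need
min_psu_w = estimated_load_w * (1 + margin)

common_sizes = [750, 850, 1000, 1200]      # assumed ladder of retail PSU sizes
choice = next(s for s in common_sizes if s >= min_psu_w)
print(f"minimum {min_psu_w:.0f} W -> choose a {choice} W unit")
```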
Your comments are welcome: I’m hoping that other robotics enthusiasts will respond. If there is enough interest, then Micro Center might create a robotics bundle and bring the price down.