11-15-2022 09:50 AM
I've configured boot from SAN on UCS with both iSCSI and FC a number of times, but either I'm getting rusty or something has changed. The UCS cluster is running 4.1, and I have two older M4 blades that I'm trying to boot from a Nimble iSCSI SAN. After creating my service profiles, assigning them to the blades, and booting the blades, I see the iSCSI BIOS get installed during boot, and it sees the Nimble as my target. I'm installing VMware ESXi 7.0 U3, and I've tried both the VMware vanilla ISO and the Cisco-specific one. I can install the OS, I see connections on my Nimble while the installation is taking place, and the LUN I created gets something transferred to it during the install. I also created a temporary SVI on my switch for the iSCSI VLAN, and I can ping the iSCSI vNICs on my blade while the OS is installing. (The installation process has to get past the screen where it's loading modules, to the screen that is half yellow and half black if you know the ESXi installer.) The install seems to progress normally, but after everything is done and I reboot the blade, it won't boot from SAN.
While trying to boot, I cannot ping the iSCSI vNICs on my blade/host. I also find it troubling that I cannot ping the iSCSI vNICs until I'm fairly far into the installer; for these to work as boot from SAN, they should be pingable as soon as the server powers up. I can't figure out where the problem is, though. The iSCSI vNICs are pulling IP addresses from the IP pool I configured for them; they just don't seem to be doing it early enough in the boot process. I'm also confused about how the iSCSI BIOS seems to load during boot-up and connect to the Nimble when I cannot ping my iSCSI vNICs during that process.
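For anyone reproducing this, one quick sanity check (independent of ping, which some arrays and initiators rate-limit or block) is to test TCP reachability to the iSCSI portal on port 3260 from another host on the iSCSI VLAN while the blade is in each phase. This is just a minimal sketch with standard library sockets; the portal IP below is a placeholder, not my actual Nimble discovery IP:

```python
import socket

def portal_reachable(host: str, port: int = 3260, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the iSCSI portal succeeds.

    3260 is the standard iSCSI target port; a successful connect only
    proves L3/L4 reachability, not that the initiator can log in.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical Nimble discovery portal IP -- substitute your own.
    print(portal_reachable("192.0.2.10"))
```

If the portal answers on 3260 during the pre-boot phase but the vNICs still won't ping, that would point at ICMP handling rather than the boot path; if 3260 is also unreachable until the installer loads, the vNICs really aren't coming up early enough.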
Anyone have any ideas where to look?