10-21-2013 03:40 PM
Hello!
I'm having a challenging time getting SuSE 11.2 to SAN-boot and am wondering what others have done to get this going.
I am aware of this page:
Specifically:
Step 6: At the boot prompt on the first installation screen, enter the kernel modifier command:
boot: linux pci=nomsi mpath
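(For context, the same kernel parameters typically also need to persist in the bootloader once the install completes. A minimal sketch, assuming GRUB legacy as shipped with SLES 11; the device names and paths below are illustrative, not from my actual setup:

```
# /boot/grub/menu.lst -- illustrative entry; root= device name is an assumption
title SLES 11 (multipath SAN boot)
    root (hd0,0)
    kernel /vmlinuz root=/dev/mapper/mpatha-part2 pci=nomsi mpath
    initrd /initrd
```

In my case I never get far enough to check this, since the install itself fails.)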
But I kept getting:
Error No active partition
Error No active partition
Error No active partition
Reboot and Select proper Boot device
or Insert Boot Media in selected Boot device and press a key.
I am using an identical setup (same Boot Policy, same array, etc.) as my other B200s, which have successfully installed ESXi and bare-metal Red Hat, so I'm wondering what I might be missing to get this SuSE server going?
All tips are appreciated! Thank you!!
10-21-2013 03:47 PM
On 6200 series FIs, after updating from 2.0(x) to 2.1(x), RPM-based Linux systems stopped booting for me. Debian systems have not had the issue. This is with M3 blades.
My fix was to use local drives, since only one server was affected and I had spare drives around.
What code are you running?
Craig
10-21-2013 03:57 PM
Ah that's interesting, thanks Craig. I'm running 2.1(2a).
This is on a B200 M2 though I do have some M3s available.
Unfortunately, local boot is not an option... we need to be 100% SAN.
10-21-2013 04:50 PM
It is always best to be 100% SAN.
2.1(2f) will run a little better but will not fix the issue.
I would try downgrading the code on the blade to 2.0(x) and seeing if that fixes the issue.
10-22-2013 10:37 AM
Thanks Craig.
Rolling back to 2.0(x) won't be an option for us. I could probably justify moving up to 2.1(2f), but it seems that didn't actually resolve the problem for you.
It's unfortunate you and I are running into this issue... I'm hoping someone else has found a Cisco fix, or perhaps an ISO with the necessary SuSE drivers, to get SuSE boot-from-SAN working.
Thanks again!