
Solved!


June 4th, 2020 23:00

Dell Alienware Aurora R9 compatibility issue with Ubuntu 18.04, dual boot

Hello Dell Support, 

We are unable to proceed further with the installation of Ubuntu 18.04 on this Dell Alienware Aurora R9.

Issue:- Getting a blank screen after the initial GRUB prompt during install.

Tried adding the acpi=off and nomodeset parameters in GRUB; still no luck.

Note:- Ubuntu 20.04 LTS worked without any issues. 

Env:-

====

UEFI - Dual boot, secure boot disabled, SATA to AHCI

Bios vers - 1.0.7

$find /sys/devices -name "edid"

/sys/devices/pci0000:00/0000:00:02.0/drm/card0/card0-HDMI-A-1/edid

/sys/devices/pci0000:00/0000:00:02.0/drm/card0/card0-DP-1/edid

$lspci | grep VGA

01:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)

02:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)

===

Surprisingly, this model has only one video output (HDMI). I checked the service manual, "https://topics-cdn.dell.com/pdf/alienware-aurora-r9-desktop_service-manual_en-us.pdf", and it mentions multiple DisplayPort outputs. I'm not sure why this unit has only the one HDMI port.

As per Dell documentation -> https://www.dell.com/support/article/en-us/sln306327/manual-nomodeset-kernel-boot-line-option-for-linux-booting?lang=en, passing nomodeset should work around these issues. In our case that's not working either.

nomodeset:- instructs the kernel not to load video drivers and to use BIOS modes instead until X is loaded.

Please share your suggestions: how can I get past the blank screen and proceed with the installation?

Thanks

Jaya

June 16th, 2020 03:00

Issue fixed.

By default, 18.04 selected the wrong disk for the boot loader installation, which caused the machine not to boot properly. Because of this, the EFI partition was mounted from the secondary disk, causing the booting issue.

Selecting the correct disk for the bootloader during installation fixed the issue.

Please mark this request as closed 

Notes:- for dual boot on the Aurora R9

For Ubuntu 18.04 / 16.04, below is the fix from my side.

From BIOS:

Disable Secure Boot

Set storage to AHCI mode

Remove quiet and splash, add "nomodeset" in GRUB, and install the OS. Select the correct disk from the bootloader list on the custom partitioning page.

On first boot, again remove quiet and splash, add "nomodeset", then install the NVIDIA proprietary drivers, preferably the latest (440). Then reboot; it will work.
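The "correct disk" check from the notes above can be sketched as a quick UUID comparison (a hypothetical sketch: the fstab line below is a sample built from the UUID seen later in this thread's logs; on the real machine read /etc/fstab directly and compare against `sudo blkid -s UUID -o value /dev/nvme0n1p1`):

```shell
# Sketch: extract the ESP UUID that /etc/fstab expects at /boot/efi, so it
# can be compared against the UUID of the ESP on the intended disk.
FSTAB_LINE='UUID=CCF3-E7D6  /boot/efi  vfat  umask=0077  0  1'
esp_uuid=$(printf '%s\n' "$FSTAB_LINE" | awk '$2 == "/boot/efi" { sub("UUID=", "", $1); print $1 }')
echo "$esp_uuid"
```

If the UUID in fstab belongs to the ESP on the wrong (secondary) disk, that matches the misbehaviour described above.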

June 7th, 2020 07:00

Current status:- Able to install 18.04 by passing "nomodeset" after upgrading the BIOS to 1.0.8.
Post-install reboot failed, even after installing the NVIDIA 440 driver.
The machine only boots by passing nomodeset in GRUB.

Please investigate the logs and provide a fix: how can we boot safely with the NVIDIA graphics driver?

$lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation Device 1e81 (rev a1)
02:00.0 VGA compatible controller: NVIDIA Corporation Device 1e81 (rev a1)

 

Refer bug -> https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1882195

June 7th, 2020 09:00

// Findings -

+++++

Issue:- Booting into recovery mode after installation.

Tried adding the nomodeset parameter in GRUB; the install completed after that, but the machine boots into recovery mode on every post-install boot.

Note:- Ubuntu 20.04 LTS worked without any issues.

Env:-
Dell Alienware Aurora R9
Ubuntu 18.04.4 LTS
5.3.0-28-generic
dosfstools 4.1-1
UEFI - Dual boot, secure boot disabled, SATA to AHCI
Bios vers - 1.0.7

$lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)
02:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)

===

Below are the journal logs for the failure where the machine boots into emergency mode. For some reason, while unpacking the initramfs, it is unable to mount the EFI partition and drops into emergency mode, pointing to a bad superblock.

This might be a bug in 18.04; at the same time, 20.04 works without any issues.

~~~
Jun 07 17:08:32 test-Alienware-Aurora-R9 kernel: Initramfs unpacking failed: Decoding failed
Jun 07 17:08:32 test-Alienware-Aurora-R9 kernel: Freeing initrd memory: 48284K
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Found device PM981a NVMe SAMSUNG 2048GB ESP.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Starting File System Check on /dev/disk/by-uuid/CCF3-E7D6...
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Started File System Check Daemon to report status.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Found device ST2000DM008-2FR102 4.

Jun 07 17:08:33 test-Alienware-Aurora-R9 kernel: Adding 62499836k swap on /dev/sda4. Priority:-2 extents:1 across:62499836k FS
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd-fsck[623]: fsck.fat 4.1 (2017-01-24)
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd-fsck[623]: /dev/nvme0n1p1: 389 files, 36496/74752 clusters
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Started File System Check on /dev/disk/by-uuid/CCF3-E7D6.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Mounting /boot/efi...
Jun 07 17:08:34 test-Alienware-Aurora-R9 mount[669]: mount: /boot/efi: wrong fs type, bad option, bad superblock on /dev/nvme0n1p1, missing codepage or helper program, or other error.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: boot-efi.mount: Mount process exited, code=exited status=32
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: boot-efi.mount: Failed with result 'exit-code'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Failed to mount /boot/efi.
Jun 07 17:08:34 test-Alienware-Aurora-R9 kernel: FAT-fs (nvme0n1p1): IO charset iso8859-1 not found
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Dependency failed for Local File Systems.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Dependency failed for Clean up any mess left by 0dns-up.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: dns-clean.service: Job dns-clean.service/start failed with result 'dependency'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: local-fs.target: Job local-fs.target/start failed with result 'dependency'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: local-fs.target: Triggering OnFailure= dependencies.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Starting Set console font and keymap...
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Started Stop ureadahead data collection 45s after completed startup.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Timers.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Closed Syslog Socket.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Started Emergency Shell.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Emergency Mode.
~~~
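Reading the excerpt above, the "bad superblock" wording may be a red herring: systemd-fsck reports the ESP as clean, while the kernel line "FAT-fs (nvme0n1p1): IO charset iso8859-1 not found" points at a missing vfat NLS module, which fits the earlier "Initramfs unpacking failed: Decoding failed" message. A hedged triage sketch (sample strings stand in for live journalctl output):

```shell
# Triage sketch: if fsck says the ESP is clean but the mount fails with a
# missing-charset error, the filesystem is fine and the initramfs is the
# likely culprit (on the real machine, try: sudo update-initramfs -u).
fsck_line='/dev/nvme0n1p1: 389 files, 36496/74752 clusters'
mount_line='FAT-fs (nvme0n1p1): IO charset iso8859-1 not found'
if printf '%s' "$fsck_line" | grep -q 'clusters' && \
   printf '%s' "$mount_line" | grep -q 'charset .* not found'; then
  verdict='filesystem clean; rebuild the initramfs'
fi
echo "$verdict"
```

This is a sketch of one plausible diagnosis, not a confirmed root cause for this machine.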


We are unable to boot normally even after passing "nomodeset" in GRUB.

This is a freshly installed machine, and I tried different kernels, switching from newer to older. In all cases, the same result. But 20.04 LTS has no issues.

I suspect it might be a hardware-dependency issue; it seems like a bug in UEFI mode.

~~~
cat /proc/scsi/scsi
Attached devices:
Host: scsi0 Channel: 00 Id: 00 Lun: 00
Vendor: ATA Model: ST2000DM008-2FR1 Rev: 0001
~~~

Please suggest

Moderator


June 8th, 2020 23:00

Hi,

 

Not sure if you have tried the steps below:

 

Disable Fast Boot and Secure Boot (or Secure loader).

 

Plug in the bootable USB with the Linux distro (mine was Ubuntu 16.04)

 

When you see the loader with "Install Ubuntu" etc., press "e" and edit the line: replace "quiet splash" with "nomodeset" and press F10 to boot.

 

Then, after the installation is complete, you will have to reboot. This time you will encounter GRUB. Again, press "e" and edit the line: in the line that starts with "linux", add "nouveau.modeset=0" at the end. Your Linux should now boot.

 

After this, you need to install the NVIDIA drivers. Reboot. And then it's done.
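The temporary "e"-key edit above has to be repeated on every boot; it can be made permanent once the system is up. A minimal sketch, assuming Ubuntu's standard /etc/default/grub layout (shown here against a temporary copy rather than the real file):

```shell
# Sketch: persist the kernel parameter by rewriting GRUB_CMDLINE_LINUX_DEFAULT.
# On the real system, edit /etc/default/grub the same way, then run
# `sudo update-grub` to regenerate /boot/grub/grub.cfg.
cfg=$(mktemp)
echo 'GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"' > "$cfg"
sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT=.*/GRUB_CMDLINE_LINUX_DEFAULT="nouveau.modeset=0"/' "$cfg"
result=$(cat "$cfg")
echo "$result"
rm -f "$cfg"
```

Once the proprietary NVIDIA driver is installed and working, the parameter can be removed again the same way.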

 

- Rohit

 

 

June 9th, 2020 03:00

Also, I managed to install the latest NVIDIA 440 version; still no luck. In all cases, the machine boots into maintenance mode.

Initramfs unpacking failed; I suspect this is caused by Fast Boot. Let me try disabling it.

If this also fails, I will proceed with Legacy mode.

June 9th, 2020 03:00

Hello Rohit, 

I'm able to pass the boot with 18.04 LTS, 

Below is the updated issue for my scenario with dual boot on the R9 workstation:

https://askubuntu.com/questions/1247085/dell-alienware-aurora-r9-compatability-issue-with-ubuntu-18-04-dual-boot

Tried with 18.04.4 ISO

Disabled secure boot

Fast Boot I haven't disabled; not sure it has any significance.

After installation, the first boot lands in maintenance mode. Further investigation of the journal logs found that the issue is an inability to mount /boot/efi during boot. I believe it's a bug in 18.04 with dual boot.

At the same time, 20.04 LTS doesn't have any issues.

Also reported this bug: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1882195

Next plan is to try Legacy mode with dual boot.

The EFI partition is causing the issue.

Please comment if you have any suggestions. 

Thanks

Jaya
