I already had a thread open on this very topic, but I failed spectacularly and reinstalled Debian afterwards - my HDMI output still doesn't work.
Here is my system:
Code:
inxi -F -z
System:
Host: paulian Kernel: 4.19.0-8-amd64 x86_64 bits: 64 Desktop: Gnome 3.30.2
Distro: Debian GNU/Linux 10 (buster)
Machine:
Type: Laptop System: Acer product: Aspire V3-772 v: V1.13 serial: <filter>
Mobo: Acer model: VA70_HW v: Type2 - Board Version serial: <filter>
UEFI: Insyde v: 1.13 date: 10/11/2013
CPU:
Topology: Quad Core model: Intel Core i7-4702MQ bits: 64 type: MT MCP
L2 cache: 6144 KiB
Speed: 2195 MHz min/max: 800/3200 MHz Core speeds (MHz): 1: 2195 2: 2196
3: 2195 4: 2195 5: 2195 6: 2201 7: 2195 8: 2196
Graphics:
Device-1: Intel 4th Gen Core Processor Integrated Graphics driver: i915
v: kernel
Device-2: NVIDIA GK106M [GeForce GTX 760M] driver: nvidia v: 418.74
Display: wayland server: X.Org 1.20.4 driver: nvidia
resolution: 1920x1080~60Hz, 1048x1680~60Hz
OpenGL: renderer: Mesa DRI Intel Haswell Mobile v: 4.5 Mesa 18.3.6
Audio:
Device-1: Intel Xeon E3-1200 v3/4th Gen Core Processor HD Audio
driver: snd_hda_intel
Device-2: Intel 8 Series/C220 Series High Definition Audio
driver: snd_hda_intel
Sound Server: ALSA v: k4.19.0-8-amd64
Network:
Device-1: Qualcomm Atheros AR9462 Wireless Network Adapter driver: ath9k
IF: wlp13s0 state: up mac: <filter>
Device-2: Broadcom Limited NetLink BCM57780 Gigabit Ethernet PCIe
driver: tg3
IF: enp14s0 state: down mac: <filter>
Drives:
Local Storage: total: 1.02 TiB used: 768.32 GiB (73.6%)
ID-1: /dev/sda vendor: Intel model: SSDMCEAC120B3A size: 111.79 GiB
ID-2: /dev/sdb vendor: Western Digital model: WD10JPVX-22JC3T0
size: 931.51 GiB
Partition:
ID-1: / size: 93.13 GiB used: 58.55 GiB (62.9%) fs: ext4 dev: /dev/dm-1
ID-2: /boot size: 236.3 MiB used: 133.5 MiB (56.5%) fs: ext2
dev: /dev/sda2
ID-3: swap-1 size: 15.88 GiB used: 0 KiB (0.0%) fs: swap dev: /dev/dm-2
Sensors:
System Temperatures: cpu: 42.0 C mobo: N/A
Fan Speeds (RPM): N/A
Info:
Processes: 309 Uptime: 2h 06m Memory: 15.55 GiB used: 2.18 GiB (14.0%)
Shell: bash inxi: 3.0.32
Code:
inxi -r
Repos: Active apt repos in: /etc/apt/sources.list
1: deb http://deb.debian.org/debian/ buster main contrib main non-free
2: deb-src http://deb.debian.org/debian/ buster main contrib main non-free
3: deb http://security.debian.org/debian-security buster/updates main contrib non-free
4: deb-src http://security.debian.org/debian-security buster/updates main contrib non-free
5: deb http://deb.debian.org/debian/ buster-updates main contrib non-free
6: deb-src http://deb.debian.org/debian/ buster-updates main contrib non-free
Active apt repos in: /etc/apt/sources.list.d/dropbox.list
1: deb [arch=i386,amd64] http://linux.dropbox.com/debian buster main
Active apt repos in: /etc/apt/sources.list.d/opera-stable.list
1: deb https://deb.opera.com/opera-stable/ stable non-free #Opera Browser (final releases)
Code:
"acpi_osi=! acpi_osi='!Windows 2009'"
Code:
Section "ServerLayout"
Identifier "Layout0"
Option "AutoAddDevices" "false"
Option "AutoAddGPU" "false"
Screen 0 "nvidia"
Inactive "intel"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
VendorName "NVIDIA Corporation"
# If the X server does not automatically detect your VGA device,
# you can manually set it here.
# To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
# as you see in the commented example.
# This Setting may be needed in some platforms with more than one
# nvidia card, which may confuse the proprietary driver (e.g.,
# trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
# BusID "PCI:01:00:0"
# Setting ProbeAllGpus to false prevents the new proprietary driver
# instance spawned to try to control the integrated graphics card,
# which is already being managed outside bumblebee.
# This option doesn't hurt and it is required on platforms running
# more than one nvidia graphics card with the proprietary driver.
# (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
# If this option is not set, the new Xorg may blacken the screen and
# render it unusable (unless you have some way to run killall Xorg).
Option "ProbeAllGpus" "false"
BusID "PCI:1:0:0"
Option "AllowEmptyInitialConfiguration"
Option "NoLogo" "true"
Option "UseEDID" "true"
# Option "UseDisplayDevice" "none"
EndSection
Section "Device"
Identifier "intel"
Driver "dummy"
BusID "PCI:0:2:0"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
EndSection
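A note on the BusID lines above: `lspci | egrep 'VGA|3D'` prints the slot in hexadecimal, while Xorg expects decimal values in `PCI:bus:device:function` form. A small sketch of the conversion, assuming the NVIDIA card sits at slot `01:00.0` as the config implies:

```shell
# Convert an lspci slot string (hex) into the Xorg BusID format (decimal).
slot="01:00.0"                          # as printed by: lspci | egrep 'VGA|3D'
bus=$(printf '%d' "0x${slot%%:*}")      # "01" -> 1
rest=${slot#*:}                         # "00.0"
dev=$(printf '%d' "0x${rest%%.*}")      # "00" -> 0
fn=${rest#*.}                           # "0"
echo "PCI:${bus}:${dev}:${fn}"
# → PCI:1:0:0
```

For a bus number of 0x0a or higher the hex and decimal forms differ, which is a common source of silently wrong BusID entries.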
Code:
Section "ServerLayout"
Identifier "Layout0"
Screen 0 "intel"
Inactive "nvidia"
EndSection
Section "Monitor"
Identifier "Monitor0"
Option "DPMS"
EndSection
Section "Device"
Identifier "nvidia"
Driver "dummy"
BusID "PCI:1:0:0"
EndSection
Section "Device"
Identifier "intel"
Driver "intel"
Option "TearFree" "true"
Option "DRI" "3"
BusID "PCI:0:2:0"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
Monitor "Monitor0"
EndSection
Code:
intel-virtual-output
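For readers unfamiliar with the tool: on many Optimus laptops the HDMI port is wired to the discrete GPU, and `intel-virtual-output` (from xserver-xorg-video-intel) bridges those outputs into the running Intel X screen as VIRTUAL displays. A hedged sketch of the usual sequence with bumblebee (whether it applies here depends on the wiring of this particular Aspire):

```shell
# Sketch only -- requires the bumblebee stack and real hardware.
optirun true           # bring up the secondary X server (:8) on the NVIDIA card
intel-virtual-output   # forward its outputs into the Intel X screen
xrandr                 # the HDMI port should now appear as e.g. VIRTUAL1
```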
Now I have tried it again following this thread and adjusted things as follows:
Installed VirtualGL and added my user to the bumblebee group with
Code:
root@debian:/home/user# usermod -aG bumblebee user
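Worth noting: the group change only takes effect after a fresh login. A sketch for checking it took (assumes the group and user names from the command above):

```shell
# Sketch only -- verify the bumblebee group membership.
getent group bumblebee   # should list "user" among the members
id -nG user              # should include "bumblebee"
```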
Code:
Section "Screen"
Identifier "Default Screen"
Device "DiscreteNvidia"
EndSection
I re-enabled Wayland.
And here are the outputs of the tests:
Code:
vblank_mode=0 glxgears
ATTENTION: default value of option vblank_mode overridden by environment.
36914 frames in 5.0 seconds = 7382.692 FPS
37302 frames in 5.0 seconds = 7460.339 FPS
37290 frames in 5.0 seconds = 7457.952 FPS
37253 frames in 5.0 seconds = 7450.573 FPS
37210 frames in 5.0 seconds = 7441.860 FPS
37197 frames in 5.0 seconds = 7439.309 FPS
37413 frames in 5.0 seconds = 7482.465 FPS
XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":0"
after 21019 requests (159 known processed) with 0 events remaining.
Code:
optirun -vv glxgears
[ 3157.934503] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[ 3157.935250] [DEBUG]optirun version 3.2.1 starting...
[ 3157.935274] [DEBUG]Active configuration:
[ 3157.935291] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 3157.935313] [DEBUG] X display: :8
[ 3157.935338] [DEBUG] LD_LIBRARY_PATH: /usr/lib/x86_64-linux-gnu/nvidia:/usr/lib/i386-linux-gnu/nvidia:/usr/lib/nvidia
[ 3157.935357] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 3157.935376] [DEBUG] Accel/display bridge: auto
[ 3157.935418] [DEBUG] VGL Compression: proxy
[ 3157.935439] [DEBUG] VGLrun extra options:
[ 3157.935452] [DEBUG] Primus LD Path: /usr/lib/x86_64-linux-gnu/primus:/usr/lib/i386-linux-gnu/primus:/usr/lib/primus:/usr/lib32/primus
[ 3157.936317] [DEBUG]Using auto-detected bridge virtualgl
[ 3158.674812] [INFO]Response: Yes. X is active.
[ 3158.674826] [INFO]Running application using virtualgl.
[ 3158.674957] [DEBUG]Process vglrun started, PID 5700.
9208 frames in 5.0 seconds = 1841.438 FPS
9860 frames in 5.0 seconds = 1971.946 FPS
9743 frames in 5.0 seconds = 1948.447 FPS
9832 frames in 5.0 seconds = 1966.328 FPS
9400 frames in 5.0 seconds = 1879.785 FPS
9172 frames in 5.0 seconds = 1834.266 FPS
[VGL] ERROR: in readback--
[VGL] 259: Window has been deleted by window manager
[ 3189.818542] [DEBUG]SIGCHILD received, but wait failed with No child processes
[ 3189.818577] [DEBUG]Socket closed.
[ 3189.818594] [DEBUG]Killing all remaining processes.
Code:
optirun -v -b virtualgl -c jpeg glxgears
[ 3233.863863] [INFO]Response: Yes. X is active.
[ 3233.863882] [INFO]Running application using virtualgl.
[VGL] ERROR: Could not connect to VGL client. Make sure that vglclient is
[VGL] running and that either the DISPLAY or VGL_CLIENT environment
[VGL] variable points to the machine on which vglclient is running.
[VGL] ERROR: in connect--
[VGL] 282: Connection refused
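The last error is about the VGL transport: with `-c jpeg` VirtualGL tries to send frames to a `vglclient` process, which the default `proxy` transport (used in the second test) does not need. A hedged sketch of what the jpeg transport would require (vglclient ships with VirtualGL; whether this transport is even wanted on a single machine is a separate question):

```shell
# Sketch only -- the jpeg transport expects a running VGL client
# on the machine that owns the display.
vglclient &
optirun -v -b virtualgl -c jpeg glxgears
```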
As always, thanks in advance, and I hope you are all making the most of the quarantine!